
KCA031
Privacy and Security in Online Social Media
Unit-2
Trust Management in OSN
Trust Management in Online Social Networks:
• Trust and Policies
• Trust and Reputation Systems
• Trust in Online Social Media
• Trust Properties, Trust Components
• Social Trust and Social Capital
• Trust Evaluation Models
• Trust, credibility, and reputations in social systems
• Online social media and Policing
• Information privacy disclosure
• Revelation and its effects in OSM and online social networks
• Phishing in OSM & Identifying fraudulent entities in online social networks
What is Trust?
Definition: Trust in online social networks refers to the belief or confidence that users have
in the reliability, credibility, and integrity of other users, content, or platforms within the
network.
According to Wikipedia
“Trust is the belief that another person will do what is expected. It brings with it a
willingness for one party (the trustor) to become vulnerable to another party (the trustee), on
the presumption that the trustee will act in ways that benefit the trustor”
Types of Trust
• Implicit Trust: Trust that is based on assumptions, intuition, or instinct, often without
explicit evidence or rational analysis. It arises from familiarity, social norms, and
subjective impressions.
• Explicit Trust: Trust that is consciously assessed and consciously granted based on
explicit evidence, information, or reasoning. It involves a deliberate evaluation of
trustworthiness.
Importance of Trust
• Building Relationships: Trust is crucial for fostering positive relationships among users
within the online social network. It enhances collaboration, communication, and
cooperation.
• User Engagement: Trustworthy networks are more likely to attract and retain users,
leading to increased engagement, user-generated content, and overall activity levels.
• Platform Credibility: Effective trust management enhances the credibility and reputation
of the platform, which is essential for attracting advertisers, investors, and stakeholders.
• Privacy and Security: Trust management mechanisms help mitigate risks related to
privacy violations, data breaches, and malicious activities, thereby safeguarding users'
personal information and ensuring a secure environment.
• Mitigating Misinformation: By promoting trustworthy content and users while
minimizing the spread of misinformation and fake news, trust management contributes to
the integrity and reliability of the network.
Factors influencing Trust
• Reputation: The perceived credibility and reliability of an individual or entity based on
past behavior, interactions, and feedback from others.
• Relationships: The strength and depth of social connections between users, including
friendships, mutual acquaintances, and shared interests.
• Consistency: The degree to which an individual or entity behaves predictably and reliably
over time, fostering trust through reliability.
• Transparency: Openness and honesty in communication and actions, including clear
intentions, disclosures, and accountability.
• Competence: Demonstrated skills, expertise, and knowledge in a particular domain,
contributing to trustworthiness.
• Similarity: Shared values, beliefs, or characteristics between individuals can enhance trust
by fostering a sense of familiarity and understanding.
What is Policy?
Definition: Policies in online social networks refer to rules, guidelines, and principles
established by platform administrators or governing bodies to regulate user behavior,
protect user rights, and govern interactions within the network.

Purpose: Policies serve to establish boundaries, promote fairness, ensure compliance with
legal and ethical standards, and maintain the integrity and security of the online social
network ecosystem.

Note: Policies play a crucial role in trust management within online social networks by
establishing guidelines, standards, and expectations for user behavior and platform
governance. By implementing and enforcing effective policies, platforms can build trust,
protect user rights, and create a safe, secure, and enjoyable environment for their users.
Role of Policies in Trust Management
• Establishing Trustworthiness: Policies define expectations and standards of behavior,
helping to create a trustworthy environment by setting clear guidelines for users to follow.
• Protecting Privacy and Security: Policies related to privacy, data sharing, and security
protect user data, mitigate risks of unauthorized access or misuse, and foster trust by
demonstrating a commitment to user privacy and security.
• Managing Content and Community Standards: Content moderation policies ensure that
user-generated content meets community standards, reduce the spread of harmful or
inappropriate content, and maintain the quality and integrity of the platform.
• Fostering Transparency and Accountability: Policies promote transparency by outlining
platform rules, procedures, and enforcement mechanisms, holding users and
administrators accountable for their actions and decisions.
• Enhancing User Experience: Well-designed policies contribute to a positive user
experience by promoting a safe, respectful, and inclusive environment, encouraging user
engagement and trust.
Example of Policies
• Privacy Policies: Guidelines outlining how user data is collected, stored, used, and shared,
including provisions for user consent, data protection, and compliance with privacy
regulations (e.g., GDPR, CCPA).
• Data Sharing Policies: Rules governing the sharing, dissemination, and access to user data by
third parties, ensuring transparency, user control, and protection of sensitive information.
• Content Moderation Policies: Standards and guidelines for acceptable content, behavior, and
interactions within the online social network, including prohibitions on hate speech,
harassment, misinformation, and other harmful content.
• Community Guidelines: Rules of conduct that outline expected behavior, etiquette, and
community norms for users, promoting civility, respect, and inclusivity within the online
social network community.
• Terms of Service: Legal agreements between the platform and users that establish rights,
responsibilities, and obligations, including provisions related to user rights, platform usage,
and dispute resolution.
What is Reputation?
Definition - Reputation is the collective evaluation or perception of an individual's or
entity's behavior, character, and performance based on past actions, experiences, and
feedback from others. Reputation serves as a valuable social currency, influencing how
others perceive and interact with an individual or entity within a community or network. In
online environments, reputation is often represented by numerical ratings, reviews, or
feedback provided by other users, which reflect the trustworthiness, reliability, and quality
of the individual or entity's actions or contributions.
According to Wikipedia
“The reputation or prestige of a social entity (a person, a social group, an organization, or
a place) is an opinion about that entity – typically developed as a result of social evaluation
on a set of criteria, such as behavior or performance”
Trust and reputation are interconnected in several ways:
• Trust Assessment: Reputation serves as a key input for trust assessment. When evaluating
whether to trust a particular individual or entity, users often rely on reputation signals, such as
ratings, reviews, or endorsements, as indicators of trustworthiness. Positive reputation can instill
confidence and reinforce trust, while negative reputation may raise red flags and lead to distrust or
caution.
• Trust Building: Reputation contributes to trust building by providing social proof and credibility.
Individuals or entities with a strong and positive reputation are more likely to be trusted by others,
as their track record of reliability and integrity enhances their perceived trustworthiness. Positive
interactions and consistent performance over time can further strengthen trust and reputation.
• Trust Maintenance: Reputation serves as a mechanism for maintaining trust over time.
Continuous feedback and reputation updates enable users to monitor and reassess the
trustworthiness of others based on their ongoing behavior and performance. Changes in
reputation, whether positive or negative, can influence trust dynamics and impact future
interactions within the network.
• Trust Feedback Loop: Trust and reputation form a feedback loop, where trust influences
reputation, and reputation, in turn, influences trust. Positive experiences and trustworthy behavior
enhance reputation, which reinforces trust and encourages further cooperation and engagement.
Conversely, breaches of trust or negative experiences can damage reputation, leading to
diminished trust and potentially isolating the individual or entity within the network.
Social Trust and Social Capital
What is Social Trust & Social Capital
Social Trust:
• Social trust refers to the level of trust and confidence that individuals have in the
reliability, honesty, and integrity of others within a society or community. It reflects the
belief that people generally adhere to social norms, fulfill their obligations, and behave in
trustworthy ways. Social trust is essential for fostering cooperation, collaboration, and
collective action within communities, as it reduces uncertainty and transaction costs
associated with interpersonal interactions. High levels of social trust are associated with
positive outcomes, such as economic development, political stability, and social cohesion,
while low levels of social trust can lead to social fragmentation, conflict, and dysfunction.
Social Capital:
• Social capital refers to the resources, relationships, and networks embedded within social
structures that facilitate cooperation, reciprocity, and mutual support among individuals
and groups. It encompasses both the tangible and intangible benefits derived from social
connections, including trust, norms of reciprocity, shared values, and collective action.
Social Trust & Social Capital
Social trust and social capital are closely interconnected concepts that reinforce each other
within social networks and communities:
• Social trust serves as a foundation for building social capital, as trust is essential for
establishing and maintaining cooperative relationships, reciprocity, and mutual aid within
social networks.
• Social capital, in turn, contributes to the cultivation of social trust by fostering strong
social ties, shared norms, and collective identities that promote trust, cooperation, and
solidarity among individuals and groups.
• Together, social trust and social capital create a virtuous cycle of mutual reinforcement,
where trust strengthens social connections and networks, while social capital enhances
trust and cooperation, leading to positive outcomes for individuals, communities, and
societies as a whole.
Trust Evaluation Models
Trust evaluation models are computational frameworks used to assess and quantify the
trustworthiness of entities, such as individuals, organizations, or pieces of content, within a
networked environment. These models aim to provide automated mechanisms for making
trust-related decisions based on available information and past interactions. Here are some
common trust evaluation models:

1. User-based Trust Metrics:
   • Reputation-based Models
   • Advogato Trust Metric
   • EigenTrust
2. Content-based Trust Analysis:
   • Web-based Models
   • Google PageRank
   • Web of Trust (WOT)
3. Social Network Analysis (SNA):
   • Friendship-based Trust
   • Group-based Trust
4. Hybrid Models:
   • Hybrid Trust Networks
5. Machine Learning-based Models
6. Context-aware Models:
   • Contextual Trust Models
User-based Trust Metrics
Reputation-based Models: Reputation-based models focus on assessing the trustworthiness of
individual users based on their reputation within the community. Users with a positive history of
interactions, contributions, and feedback from peers are considered more trustworthy. These models
often rely on feedback mechanisms such as ratings, reviews, endorsements, and reputation scores to
quantify trust.
• Advogato Trust Metric: Developed for the Advogato online community, the Advogato Trust
Metric is a specific implementation of a reputation-based model. It assigns trust levels to users
based on their contributions to open-source software projects and endorsements from other trusted
users. The metric incorporates a recursive algorithm to propagate trust through the network.
• EigenTrust: EigenTrust is a trust evaluation algorithm used in peer-to-peer networks. It computes
global trust values for each peer based on iterative calculations using the eigenvector centrality of
the trust graph. Peers with higher EigenTrust values are considered more trustworthy, and their
recommendations carry more weight in the network.
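The iterative computation behind EigenTrust can be sketched as a power iteration over a normalized local-trust matrix. The matrix values below are made up for illustration; the full algorithm also incorporates pre-trusted peers and defenses against malicious collectives:

```python
# Minimal sketch of the EigenTrust iteration: global trust is the
# principal left eigenvector of the normalized local-trust matrix C,
# approximated by repeated multiplication. Matrix values are illustrative.

C = [
    [0.0, 0.7, 0.3],   # peer 0's normalized local trust in peers 0, 1, 2
    [0.5, 0.0, 0.5],
    [0.6, 0.4, 0.0],
]

n = len(C)
t = [1.0 / n] * n                   # start from uniform trust
for _ in range(50):                 # iterate t <- C^T t until it stabilizes
    t = [sum(C[i][j] * t[i] for i in range(n)) for j in range(n)]

print([round(x, 3) for x in t])     # higher value = globally more trusted peer
```

Because each row of C sums to 1, the trust vector keeps summing to 1, and the iteration converges to a stable global ranking regardless of the starting vector.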
Content-based Trust Analysis
Web-based Models: Web-based trust analysis focuses on evaluating the credibility and reliability of
online content, such as web pages and websites. Factors such as the source of the content, the quality
of writing, the presence of citations or references, and the absence of misinformation or spam are
considered indicators of trustworthiness.
• Google PageRank: Google PageRank is an algorithm used to rank web pages in search engine
results based on the quantity and quality of incoming links. Pages with a higher number of
authoritative inbound links are considered more trustworthy and rank higher in search results.
• Web of Trust (WOT): Web of Trust (WOT) is a browser extension and community-driven trust
evaluation system that rates the trustworthiness of websites based on user feedback and ratings. It
provides warnings about potentially unsafe or unreliable websites, helping users make informed
browsing decisions.
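The link-analysis idea behind PageRank can be sketched as a power iteration over a tiny made-up link graph. The damping factor 0.85 follows the original formulation, but the four-page graph is illustrative and the production algorithm includes many further refinements:

```python
# Simplified PageRank: a page's score depends on the scores of the pages
# linking to it, divided by each linker's out-degree. The link graph
# below is made up for illustration.

links = {                       # page -> pages it links to
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}

pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}
d = 0.85                        # damping factor from the original formulation

for _ in range(50):             # power iteration until ranks stabilize
    new = {p: (1 - d) / len(pages) for p in pages}
    for p, outs in links.items():
        for q in outs:
            new[q] += d * rank[p] / len(outs)
    rank = new

best = max(rank, key=rank.get)
print(best, round(rank[best], 3))   # C gathers the most inbound link authority
```

Page C ends up ranked highest because three of the four pages link to it, which is exactly the "quantity and quality of incoming links" intuition described above.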
Social Network Analysis (SNA)
• Friendship-based Trust: Friendship-based trust models assess trustworthiness based on direct
social connections between users within a social network. Users are more likely to trust their
friends and connections, and the strength and duration of these friendships influence trust
assessments.
• Group-based Trust: Group-based trust models evaluate trust within specific social groups or
communities. Trust is derived from collective behavior, consensus among group members, and
shared values or norms. These models consider factors such as group cohesion, collaboration
dynamics, and the influence of group leaders on trust assessments.
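One simple way to operationalize friendship-based trust is to score the overlap between two users' friend circles. The sketch below uses Jaccard similarity over illustrative friend lists; real friendship-based models also weigh tie strength, interaction history, and friendship duration:

```python
# Friendship-based trust sketch: infer a trust score between two users
# from the overlap of their friend sets (Jaccard similarity).
# The user names and the scoring choice are illustrative.

friends = {
    "alice": {"bob", "carol", "dave"},
    "bob":   {"alice", "carol", "erin"},
    "erin":  {"frank"},
}

def friendship_trust(u, v):
    """Jaccard overlap of u's and v's friend circles, in [0, 1]."""
    a, b = friends[u], friends[v]
    return len(a & b) / len(a | b) if a | b else 0.0

print(round(friendship_trust("alice", "bob"), 3))   # shared circle -> some trust
print(round(friendship_trust("alice", "erin"), 3))  # no shared friends -> none
```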
Hybrid Models, Machine Learning-based
Models and Context Trust Models
• Hybrid Trust Networks: Hybrid trust networks combine multiple trust evaluation approaches and
techniques to provide comprehensive trust assessments within online communities. By integrating
user-based, content-based, and network-based analyses, hybrid models offer a more nuanced
understanding of trust dynamics and enhance the accuracy of trust evaluations.
• Machine learning-based trust evaluation models use algorithms and statistical techniques to
analyze large datasets and make predictions about trustworthiness. These models learn from
labeled data to classify users or content as trustworthy or untrustworthy, incorporating features
such as user behavior, content characteristics, and social network structure.
• Contextual Trust Models: Contextual trust models consider the context in which trust decisions
are made, adapting trust assessments to specific situations, environments, and user preferences. By
incorporating situational factors, temporal trends, cultural norms, and individual preferences, these
models provide more personalized and adaptive trust evaluations tailored to the needs of individual
users.
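The machine-learning approach above can be sketched with a tiny hand-rolled logistic-regression classifier over two made-up behavioral features (account age in years, number of times reported). A real system would use far richer features and an established library:

```python
import math

# Sketch of ML-based trust classification: logistic regression trained by
# plain gradient descent on two invented features per account
# (account_age_years, times_reported). Data and labels are illustrative.

data = [  # (features, label) with label 1 = trustworthy
    ((5.0, 0.0), 1), ((3.0, 0.0), 1), ((4.0, 1.0), 1),
    ((0.1, 3.0), 0), ((0.2, 2.0), 0), ((0.5, 3.0), 0),
]

w = [0.0, 0.0]
b = 0.0
lr = 0.05

def predict(x):
    z = w[0] * x[0] + w[1] * x[1] + b
    return 1 / (1 + math.exp(-z))    # probability the account is trustworthy

for _ in range(2000):                # stochastic gradient descent on log loss
    for x, y in data:
        err = predict(x) - y
        w[0] -= lr * err * x[0]
        w[1] -= lr * err * x[1]
        b -= lr * err

print(predict((4.5, 0.0)) > 0.5)     # old, never-reported account
print(predict((0.1, 3.0)) > 0.5)     # new, repeatedly reported account
```

The learned weights end up positive for account age and negative for report count, so the classifier generalizes the labeled examples to unseen accounts.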
Trust, Credibility, and Reputations in Social Systems
Trust:
• Trust refers to the belief or confidence that one party (the trustor) has in another
party (the trustee) to act in a reliable, competent, and ethical manner.
• Trust is essential for fostering cooperation, collaboration, and mutual benefit in
social interactions, as it reduces uncertainty and risk.
• Trust can be built over time through consistent behavior, transparency, reliability,
and integrity. However, it can also be fragile and easily damaged by betrayal,
dishonesty, or breaches of trust.

Credibility:
• Credibility relates to the perceived believability, trustworthiness, and reliability of
information, sources, or individuals.
• In social systems, credibility is crucial for establishing authority, persuasiveness,
and influence. Credible individuals or sources are more likely to be trusted and
respected by others.
• Factors influencing credibility include expertise, experience, track record,
consistency, transparency, and alignment with values or interests.

Reputation:
• Reputation encompasses the collective perceptions, evaluations, and opinions that
others hold about an individual, organization, or entity based on their past behavior,
actions, and interactions.
• A good reputation is valuable, as it enhances trust, credibility, and social capital,
facilitating cooperation, opportunities, and positive outcomes.
• Reputation can be built or damaged through consistent behavior, ethical conduct,
quality of work, responsiveness to feedback, and how one treats others.
• In social systems, reputation serves as a form of social currency, influencing social
status, relationships, and decision-making processes.
Information Privacy Disclosure
Information Privacy Disclosure refers to the process of informing individuals about
how their personal information is collected, used, disclosed, and managed by an
organization or entity. It involves transparency and communication regarding the
practices and policies that govern the handling of personal data. Information privacy
disclosure is a fundamental aspect of privacy laws and regulations, as it empowers
individuals to make informed decisions about the sharing of their personal information
and allows them to exercise control over their privacy.
Key Components of Information Privacy Disclosures:
1. Data Collection Practices
2. Purpose of Data use
3. Data sharing and Disclosure
4. Data Security Measures
5. Individual Rights and Choices
6. Privacy Policies and Notices
Revelation and its Effects in OSM
“Revelation” refers to the intentional or unintentional disclosure of personal
information by users. This disclosure can occur through various means, such as sharing
posts, comments, photos, or other content that reveals details about oneself, including
personal experiences, preferences, relationships, or identifiable information.

Revelation effects in OSM:
1. Privacy Risks
2. Identity Formation
3. Social Interaction
4. Information Exchange
5. Algorithm Influence
6. Trust and Authenticity
Phishing in OSM
Phishing is a social engineering attack in which attackers impersonate trusted people,
brands, or platforms to trick users into revealing sensitive information, such as login
credentials or financial details.
Types of Phishing Attacks
• Email Phishing
• Spear Phishing
• Whaling
• Business Email Compromise (BEC)
• Voice Phishing (Vishing)
• HTTPS Phishing
• Clone Phishing
• SMS Phishing (Smishing)
• Pop-Up Phishing
• Evil Twin Phishing
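Several of these attack types hinge on deceptive links shared through posts or messages. A toy heuristic for flagging suspicious URLs might look like the sketch below; the regex rules (raw IP hosts, "@" obfuscation, a lookalike of one brand name) and the example URLs are illustrative, nowhere near a production detector:

```python
import re

# Toy heuristic checks for suspicious links of the kind used in OSM
# phishing. Every rule and example here is illustrative.

def looks_suspicious(url):
    checks = [
        r"^https?://\d{1,3}(\.\d{1,3}){3}",   # raw IP address instead of a domain
        r"@",                                  # user@host obfuscation of the real host
        r"paypa[l1].*\.(?!com)",               # lookalike of one well-known brand
    ]
    return any(re.search(p, url) for p in checks)

print(looks_suspicious("http://192.168.4.2/login"))            # IP-based host
print(looks_suspicious("https://paypa1-secure.example/verify"))  # digit-for-letter swap
print(looks_suspicious("https://www.wikipedia.org/"))          # nothing flagged
```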
Identifying fraudulent entities in OSN
Identifying fraudulent entities in OSNs involves recognizing and mitigating various forms of
deceptive behavior and malicious activities perpetrated by individuals or entities posing as
legitimate users or organizations. Strategies for detecting and addressing phishing and
fraudulent entities in OSNs include:
1. Educating Users: Promote awareness and education initiatives to help users recognize
common phishing tactics, such as fake profiles, misleading messages, and suspicious
links or attachments. Provide guidance on verifying the authenticity of communications
and avoiding interactions with unfamiliar or suspicious accounts.
2. Monitoring for Anomalies: Implement monitoring systems and algorithms to detect
unusual patterns of activity, such as sudden spikes in friend requests, messages, or
account logins, which may indicate fraudulent behavior or coordinated phishing
campaigns. Monitor for account impersonation, account hijacking, or suspicious changes
to profile information.
3. Verification and Authentication: Introduce verification mechanisms, such as blue
checkmarks or badges, to authenticate the identities of legitimate organizations, public
figures, or high-profile users on OSNs. Encourage users to verify the authenticity of
accounts by cross-referencing information with official websites or contacting trusted
sources directly.
4. Reporting and Response Mechanisms: Establish clear procedures and channels for
users to report suspicious or fraudulent activity, including phishing attempts, fake
profiles, or scam messages. Empower users to flag or report suspicious content,
accounts, or interactions for review and action by platform administrators or
moderators.
5. Enhanced Security Features: Enhance security features and privacy settings on OSNs
to mitigate the risk of phishing attacks and unauthorized access to user accounts.
Implement two-factor authentication (2FA), account recovery options, and security
alerts to help users protect their accounts and prevent unauthorized access.
6. Collaboration with Authorities: Collaborate with law enforcement agencies,
cybersecurity experts, and industry partners to investigate and mitigate phishing attacks,
fraudulent schemes, and online scams targeting users of online social networks. Share
threat intelligence, best practices, and resources to enhance collective resilience against
fraudulent activities.
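The anomaly monitoring described in strategy 2 can be sketched as a z-score check on an account's daily activity counts; the counts and the 3-sigma threshold below are illustrative:

```python
import statistics

# Sketch of anomaly monitoring: flag a day whose friend-request count
# deviates more than 3 standard deviations from the account's historical
# mean. Counts and the threshold are illustrative.

history = [4, 6, 5, 7, 5, 6, 4, 5, 6, 5]   # normal daily friend-request counts
today = 48                                  # sudden spike in requests

mean = statistics.mean(history)
stdev = statistics.stdev(history)           # sample standard deviation
z = (today - mean) / stdev

if z > 3:
    print("ALERT: possible coordinated phishing or fake-account activity")
```

The same check generalizes to messages sent, login attempts, or profile changes, the other signals named in strategy 2, by swapping in the relevant counter.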
