
Chapter-11

Data Privacy and Risks


Prepared by
Lec. Kashfia Maisha
Dark Side of Big Data

As the influence of the big data phenomenon grows increasingly widespread, with more and more organizations relying on big data analytics to carry out their daily operations, big data has revolutionized the digital landscape.
When it comes to running a digital enterprise or an e-business amidst today's complex threat landscape, the valuable insights generated by big data play a crucial part in helping businesses flourish. These insights inform everything from predicting the most popular items of the season to providing the framework businesses use to improve their brand-consumer relationships.
Dark Side of Big Data
As more and more companies build the insights generated by big data into the backbone of their business, and as a growing number of organizations harness the power of big data to make their enterprise stand out from the competition, many enterprise owners and security experts tend to willfully ignore the ‘dark’ side of big data.
As ominous as the phrase ‘the dark side’ sounds, perhaps even darker is the multitude of threats and vulnerabilities associated with big data that often get swept under the rug. With the power harnessed by big-data insights growing at a rapid pace, the privacy issues associated with big data are also catapulting to the top of the ‘security and privacy concerns’ totem pole.
With the IT landscape perpetually on the brink of yet another breach or vulnerability, it is vital that enterprise owners grasp the reality of the big data situation: instead of viewing big data as a flawless phenomenon, they must come to terms with the multiple privacy loopholes it presents.
Risks of Big Data
1. Obstruction of Privacy Through Breaches: In the modern digital landscape, where phenomena such as the “filter bubble” and “personalized marketing” are on the rise, many individuals fear that their privacy, particularly their online privacy, is in a state of constant decline. That fear is justified to a certain extent, and if the fragile state of cybersecurity continues on its current path, our future might well resemble an Orwellian nightmare.
And if that wasn’t enough, the magnitude of the situation is amplified further when we account for the privacy breaches made possible through big data insights and analyses. A large portion of big data insights consists of predictions about customers’ details. These details are often extremely personal in nature, which is why even the mere chance of them falling into the wrong hands is enough to eradicate any trust individuals have in the organization. We should take cues from security-centric organizations, which are typically ahead of the curve.
Risks of Big Data
2. It Becomes Near-Impossible to Achieve Anonymity: As organizations employ big data analytics, even the notion of a truly anonymized data file becomes unrealistic. Since big data insights draw on a wide variety of raw data sets, there is a high possibility that consumers’ identifying attributes will be exposed, which eradicates any semblance of privacy the individual might have.
Moreover, even in the rare instance that a data file is completely ‘anonymized,’ it can still be combined with other data sets, making it quite easy to identify an individual (as the sketch below illustrates). Anonymization is further complicated by the fact that nearly every SME that does business online relies on finance, invoicing, and accounting software hosted by third parties in the cloud, many of which have varying data privacy practices. On top of the already meager privacy individuals are afforded, the interconnectedness of devices in the ever-expanding IoT, which houses millions of smart gadgets, makes the notion of obtaining privacy even more obscure.
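To make the re-identification risk concrete, below is a minimal, self-contained sketch (not from the original slides) of a so-called linkage attack: an “anonymized” file stripped of names is joined against a public record set on a few quasi-identifiers, and the individuals are re-identified. All records, names, and field names are hypothetical.

```python
# Minimal linkage-attack sketch: joining an "anonymized" data set with a
# public one on quasi-identifiers (ZIP code, birth year, gender).
# All records and names below are made up for illustration.

anonymized_health_records = [
    {"zip": "02139", "birth_year": 1988, "gender": "F", "diagnosis": "asthma"},
    {"zip": "10001", "birth_year": 1975, "gender": "M", "diagnosis": "diabetes"},
]

public_voter_roll = [
    {"name": "A. Example", "zip": "02139", "birth_year": 1988, "gender": "F"},
    {"name": "B. Sample",  "zip": "10001", "birth_year": 1975, "gender": "M"},
]

def link(records, reference):
    """Re-identify records by matching on the shared quasi-identifiers."""
    matches = []
    for rec in records:
        for person in reference:
            if (rec["zip"], rec["birth_year"], rec["gender"]) == (
                person["zip"], person["birth_year"], person["gender"]
            ):
                matches.append({"name": person["name"], "diagnosis": rec["diagnosis"]})
    return matches

print(link(anonymized_health_records, public_voter_roll))
# [{'name': 'A. Example', 'diagnosis': 'asthma'}, {'name': 'B. Sample', 'diagnosis': 'diabetes'}]
```

The point of the sketch is that removing names alone does not anonymize a file: a handful of quasi-identifiers is often enough to single a person out once a second data set becomes available.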
Risks of Big Data
3. Data Masking Met With Failure in a Big Data-Driven Setting: In an attempt to protect their
confidential information from hackers and cybercriminals, most organizations utilize the procedure
of ‘Data masking.’ As its name suggests, data masking, also referred to as data obfuscation, is the
process through which real data is hidden by other less important characters or data sets. Typically,
data masking is deployed to veil sensitive information from unauthorized individuals.
In most enterprises, the primary function served by data masking is keeping confidential data from ending up in the wrong hands, but this does not necessarily hold true in big data-driven settings. This is particularly true for software enterprises, which rely on software-as-a-service (SaaS) marketing agencies to grow; SaaS data aggregation is especially susceptible to data breaches, as most customer data now lives in the cloud. If not applied properly, data masking can fail completely, compromising the security, and subsequently the privacy, of multiple individuals through big data analytics.
The only solution to the conundrum posed by the functioning of big data alongside data masking is
for companies to establish a stringent policy that lays out the rules for data masking, along with
ensuring that those rules are followed by each employee.
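As a simple illustration of the masking idea described above (a sketch, not a production technique), the snippet below obscures all but the last few characters of fields designated as sensitive before records are handed to an analytics pipeline. The field names and masking rule are assumptions made for this example.

```python
import copy

# Fields treated as sensitive in this hypothetical record layout.
SENSITIVE_FIELDS = {"email", "card_number", "phone"}

def mask_value(value: str, visible: int = 4) -> str:
    """Replace every character except the last `visible` ones with '*'."""
    if len(value) <= visible:
        return "*" * len(value)
    return "*" * (len(value) - visible) + value[-visible:]

def mask_record(record: dict) -> dict:
    """Return a copy of the record with sensitive fields obfuscated."""
    masked = copy.deepcopy(record)
    for field in SENSITIVE_FIELDS & masked.keys():
        masked[field] = mask_value(str(masked[field]))
    return masked

customer = {
    "name": "Jane Doe",
    "email": "jane.doe@example.com",
    "card_number": "4111111111111111",
    "purchase_total": 249.99,
}
print(mask_record(customer))
# {'name': 'Jane Doe', 'email': '****************.com',
#  'card_number': '************1111', 'purchase_total': 249.99}
```

Whether the name itself should be masked, and whether the masking must be irreversible, are exactly the kinds of decisions a company-wide masking policy has to spell out.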
Risks of Big Data
4. Copyrights and Patents Are Rendered Irrelevant: Another issue that makes the inclusion of big data within organizations painstaking is that, in a big data-driven setting, obtaining patents becomes extremely challenging. One of the primary reasons is that verifying the uniqueness of a patent takes an excruciatingly long time amidst the monumental database of information available.
Furthermore, in a big data environment, copyrights risk being rendered irrelevant, since big data makes the manipulation of data highly feasible, which in turn sends the royalties associated with inventing something original tumbling into dust.
5. Discrimination Issues: As humanity enters the digital age with open arms, one might assume that we would leave racism and blatant discrimination in the past; unfortunately, they remain real issues that continue to wreak damage.
Risks of Big Data
Although discrimination exists in almost all sectors and industries, with the inclusion of big data insights and analytics, companies can now infer the race of an individual and leverage that piece of information against them. For instance, if a person were to apply for a bank loan, the company could use predictive analytics to determine the person’s race and reject them on that basis, a phenomenon now known as “automated discrimination.”
6. Big Data Analysis Isn’t Completely Accurate: Perhaps the most surprising issue with big data is that, contrary to popular belief, the analysis it generates isn’t as accurate as we previously thought it to be. Although the insights formulated by big data are powerful, they can also be critically flawed at times, further contributing to the privacy issues we’ve mentioned so far.
Risks of Big Data
Typically, an inaccurate big data analysis is rooted in flawed algorithms, incorrect data models, or misplaced data about individuals. Running a poorly executed big data analysis not only undermines the validation of the data; it can also directly harm consumers, resulting in lost jobs, misdiagnoses, and the denial of essential services.
The reality of the “big data situation” comes with the plethora of privacy issues that arise in a big data-driven setting. Having said that, we can only hope that organizations prioritize the privacy concerns we’ve mentioned and take a more active approach to resolving them.
Control Measures
Seven Global Privacy Principles
The truth is that most companies develop their own privacy policies as a matter of establishing a modicum of “trust” with consumers. Most companies have ingrained some variation of the seven principles outlined in the “EU-US Safe Harbor Principles” into their self-regulation for data privacy (a minimal illustrative sketch follows the list):
1. Notice (Transparency): Inform individuals about the purposes for which
information is collected
2. Choice: Offer individuals the opportunity to choose (or opt out) whether and how
personal information they provide is used or disclosed
3. Consent: Only disclose personal information to third parties in a manner consistent with the principles of notice and choice
4. Security: Take reasonable measures to protect personal information from loss, misuse, and unauthorized access, disclosure, alteration, and destruction
Seven Global Privacy Principles
5. Data Integrity: Assure the reliability of personal information for its intended use and take reasonable precautions to ensure information is accurate, complete, and current
6. Access: Provide individuals with access to the personal information held about them
7. Accountability: A firm must be accountable for following the principles and must include mechanisms for assuring compliance.
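For illustration only, the sketch below models a data-processing record whose fields map onto the seven principles; the class name, field names, and values are assumptions for teaching purposes, not part of any standard.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ProcessingRecord:
    """Hypothetical record tying one data-processing activity to the seven principles."""
    purpose: str                 # Notice: why the data is collected
    opt_out_offered: bool        # Choice: can the individual opt out of this use?
    third_party_consent: bool    # Consent: disclosure consistent with notice and choice
    security_controls: list[str] # Security: safeguards applied to the data
    last_verified: datetime      # Data integrity: when accuracy/completeness was last checked
    access_endpoint: str         # Access: where the individual can view their data
    accountable_owner: str       # Accountability: who answers for compliance

record = ProcessingRecord(
    purpose="order fulfilment and delivery updates",
    opt_out_offered=True,
    third_party_consent=False,
    security_controls=["encryption at rest", "role-based access"],
    last_verified=datetime(2024, 1, 15),
    access_endpoint="/my-account/data",
    accountable_owner="privacy-office@example.com",
)
```

Keeping such a record per processing activity is one simple way an organization can show, rather than merely claim, that each principle has been considered.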
Organization-wide privacy control measures
Protecting privacy requires that big-data users become more accountable for their
actions. At the same time, society will have to redefine the very notion of justice to
guarantee human freedom to act (and thus to be held responsible for those actions).
Lastly, new institutions and professionals will need to emerge to interpret the
complex algorithms that underlie big-data findings, and to advocate for people who
might be harmed by big data.
1. From privacy to accountability: firms will formally assess a particular reuse of
data based on the impact it has on individuals whose personal information is being
processed. This does not have to be onerously detailed in all cases, as future privacy
laws will define broad categories of uses, including ones that are permissible without
or with only limited, standardized safeguards. For riskier initiatives, regulators will
establish ground rules for how data users should assess the dangers of a planned use
and determine what best avoids or mitigates potential harm. This spurs creative
reuses of the data, while at the same time it ensures that sufficient measures are
taken to see that individuals are not hurt.
Organization-wide privacy control measures
2. People versus predictions: with big data we can predict human actions
increasingly accurately. This tempts us to judge people not on what they did, but on
what we predicted they would do. In the big-data era we will have to expand our
understanding of justice and require that it include safeguards for human agency as
much as we currently protect procedural fairness. Without such safeguards the very
idea of justice may be utterly undermined. By guaranteeing human agency, we
ensure that government judgments of our behavior are based on real actions, not
simply on big-data analysis. Thus, government must only hold us responsible for our
past actions, not for statistical predictions of future ones.
When governments or organizations base such decisions mostly on big-data predictions, we recommend that certain safeguards be in place. First is openness: making available the data and
algorithm underlying the prediction that affects an individual. Second is certification:
having the algorithm certified for certain sensitive uses by an expert third party as
sound and valid. Third is disprovability: specifying concrete ways that people can
disprove a prediction about themselves.
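As a purely illustrative sketch (all names and fields are assumptions), these three safeguards can be attached to every automated decision as audit metadata: a pointer to the data and algorithm used (openness), a certification reference (certification), and a channel through which the affected person can contest the prediction (disprovability).

```python
from dataclasses import dataclass

@dataclass
class PredictionAudit:
    """Hypothetical audit metadata attached to one big-data-driven decision."""
    subject_id: str
    decision: str
    # Openness: where the underlying data set and algorithm version are disclosed.
    dataset_reference: str
    algorithm_version: str
    # Certification: which third party certified the algorithm for this sensitive use.
    certified_by: str
    # Disprovability: a concrete channel through which the individual can contest the prediction.
    appeal_channel: str

audit = PredictionAudit(
    subject_id="applicant-123",
    decision="loan application flagged for manual review",
    dataset_reference="credit-history-snapshot-2024-01",
    algorithm_version="risk-model v2.3",
    certified_by="Independent Algorithm Audit Board (hypothetical)",
    appeal_channel="appeals@example-bank.com",
)
```

Recording this metadata at decision time is what makes later review, by a regulator or by the individual concerned, possible at all.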
Organization-wide privacy control measures
3. Breaking the black box: there is a risk that big-data predictions, and the algorithms and datasets behind them, will become black boxes that offer us no accountability, traceability, or confidence. To prevent this, big data will require monitoring and transparency, which in turn will require new types of expertise and institutions. These new players will provide support in areas where society needs to scrutinize big-data predictions and enable people who feel wronged by them to seek redress. More recently, specialists in computer security and privacy have cropped up to certify that companies are complying with the best practices determined by bodies like the International Organization for Standardization (which was itself formed to address a new need for guidelines in this field).
4. The rise of the algorithmist: These new professionals would be experts in the areas of computer science, mathematics, and statistics; they would act as reviewers of big-data analyses and predictions.
Organization-wide privacy control measures
Algorithmists would take a vow of impartiality and confidentiality, much as
accountants and certain other professionals do now. They would evaluate the
selection of data sources, the choice of analytical and predictive tools, including
algorithms and models, and the interpretation of results. In the event of a dispute,
they would have access to the algorithms, statistical approaches, and datasets that
produced a given decision.
We envision external algorithmists acting as impartial auditors to review the
accuracy or validity of big-data predictions whenever the government requires it,
such as under court order or regulation. They also can take on big-data companies as
clients, performing audits for firms that want expert support. Internal algorithmists
work inside an organization to monitor its big-data activities. They look out not just
for the company’s interests but also for the interests of people who are affected by
its big-data analyses.
Thank you
