The Ethics of Corporate Virtues
B. Indian Ethics
Dharma, or the righteous way to live our lives, is the Indian
version of ethics. Lord Krishna, in the Bhagwat Gita, elaborates on the concept of Dharma: “Every organism is born to serve a
purpose. Understanding the purpose and living accordingly is
Dharma.” In this definition, it is ethical to wage a just war, if
that is your purpose and as long as you are not seeking to gain
personal prestige or wealth or power—you can even kill your
cousins, if required. Vyasa, in the Mahabharat, expands on the
concept of Dharma— “To actively help those in need as well as
passively not harming others and being fair and just in one’s
judgements.” Elements of Dharmashastra can be found in Virtue Ethics, Utilitarianism and Moral Rights Theory, but not in Deontology, which is a philosophy of absolutes: war is wrong, even if it is a just war; killing is wrong, even in self-defense; stealing is wrong, even if it is to feed a dying person; lying is wrong, even if it saves the life of an innocent person. Deontology asks us to find ways other than war, killing, stealing and lying. Ethics today is fully incorporated into the law of the land, and it is Western law that most countries have adopted, with the exception of some countries that follow Shariat law, which is based on the Islamic version of the Divine Command Theory.
Lay and Skilling went on trial for their part in the Enron scandal in
January 2006. The 53-count, 65-page indictment covers a broad
range of financial crimes, including bank fraud, making false
statements to banks and auditors, securities fraud, wire fraud,
money laundering, conspiracy, and insider trading. United States
District Judge Sim Lake had previously denied motions by the
defendants to have separate trials and to relocate the case out of
Houston, where the defendants argued the negative publicity
concerning Enron's demise would make it impossible to get a fair
trial. On May 25, 2006, the jury in the Lay and Skilling trial returned
its verdicts. Skilling was convicted of 19 of 28 counts of securities
fraud and wire fraud and acquitted on the remaining nine, including
charges of insider trading. He was sentenced to 24 years and 4
months in prison.[102] In 2013 the United States Department of
Justice reached a deal with Skilling, which resulted in ten years being
cut from his sentence.[103]
Lay pleaded not guilty to the eleven criminal charges, and claimed
that he was misled by those around him. He attributed the main
cause for the company's demise to Fastow.[104] Lay was convicted
of all six counts of securities and wire fraud for which he had been
tried, and he was subject to a maximum total sentence of 45 years in
prison.[105] However, before sentencing was scheduled, Lay died on
July 5, 2006. At the time of his death, the SEC had been seeking more
than $90 million from Lay in addition to civil fines. The case of Lay's
wife, Linda, is a difficult one. She sold roughly 500,000 shares of Enron between ten and thirty minutes before the news that Enron was collapsing became public on November 28, 2001. Linda was never charged in connection with any of the events at Enron.
Arthur Andersen was charged with and found guilty of obstruction of
justice for shredding the thousands of documents and deleting e-
mails and company files that tied the firm to its audit of Enron.
Although only a small number of Arthur Andersen's employees were
involved with the scandal, the firm was effectively put out of
business; the SEC is not allowed to accept audits from convicted
felons. The company surrendered its CPA license on August 31, 2002,
and 85,000 employees lost their jobs. The conviction was later overturned by the U.S. Supreme Court because the jury had not been properly instructed on the charge against Andersen. The ruling theoretically left Andersen free to resume operations. However, the damage to the Andersen name was so great that the firm has not returned as a viable business, even on a limited scale.
iii. The Cambridge Analytica Scandal
The Facebook–Cambridge Analytica data scandal was a major
political scandal in early 2018 when it was revealed that Cambridge
Analytica had harvested the personal data of millions of people's
Facebook profiles without their consent and used it for political
advertising purposes. It has been described as a watershed moment in the public understanding of personal data; it precipitated a massive fall in Facebook's stock price and prompted calls for tighter regulation of tech companies' use of personal data.
Aleksandr Kogan, a data scientist at Cambridge University, developed
an app called "This Is Your Digital Life". He provided the app to
Cambridge Analytica.[3] Cambridge Analytica in turn arranged an
informed consent process for research in which several hundred thousand Facebook users agreed to complete a survey, supposedly for academic use only. However, Facebook's design allowed the app not only
to collect the personal information of people who agreed to take the
survey, but also the personal information of all the people in those
users' Facebook social network. In this way Cambridge Analytica
acquired data from millions of Facebook users.
In the US, the story of how whistleblower Christopher Wylie had
built media mogul Steve Bannon’s “psychological warfare tool” by
harvesting millions of people’s Facebook profiles had erupted across
every news channel. Questions rained in on Cambridge Analytica,
Facebook, and its boss, Mark Zuckerberg, including the most
insistent – where was he?
A couple of hours later, I glanced at Twitter and saw a graph. It
showed a wavering line heading off a cliff. Facebook’s share price
had plunged $30bn in the first two hours of trading. By the end of
the week it was more than $100bn. Today it’s $170bn down.
If there’s one tiny ray of light in all this, it’s that journalism can have
an impact – even the cash-strapped, shoestring British variety. And if
there’s a reason to despair, it’s that it’s not enough.
Zuckerberg, Facebook's founder and chief executive, has defied parliament.
The company is quite simply beyond the rule of law. Because what
the Cambridge Analytica story exposed, by accident, from Facebook’s
reaction in the months that followed, is the absolute power of the
tech giants. Power and unaccountability that is the foundational
platform on which populist authoritarians are rising to power all
across the globe. Power and unaccountability that continues
unchecked. In Britain, in a media landscape that is insular and self-
regarding and obsessed with what happens at Westminster, we’ve
failed to connect the dots between Facebook and Brexit and the
world outside. To the global currents that favour autocrats and
populists. And to the technology platforms assisting them.
On October 27, 2012, Facebook CEO Mark Zuckerberg wrote an
email to his then-director of product development. For years,
Facebook had allowed third-party apps to access data on their users’
unwitting friends, and Zuckerberg was considering whether giving
away all that information was risky. In his email, he suggested it was
not: “I’m generally skeptical that there is as much data leak strategic
risk as you think,” he wrote at the time. “I just can’t think of any
instances where that data has leaked from developer to developer
and caused a real issue for us.”
In 2013, two University of Cambridge researchers published a paper
explaining how they could predict people’s personalities and other
sensitive details from their freely accessible Facebook likes. These
predictions, the researchers warned, could “pose a threat to an
individual’s well-being, freedom, or even life.” Cambridge Analytica's
predictions were based largely on this research.
The scandal and backlash grew to encompass the ways that
businesses, including but certainly not limited to Facebook, take
more data from people than they need, and give away more than
they should, often only asking permission in the fine print—if they
even ask at all. There has been a growing recognition that companies
can no longer be left to regulate themselves, and some states have
begun to act on it. Vermont implemented a new law that requires data brokers, which buy and sell data from third parties, to register with the state. In California, a law is set to go into effect in January
that would, among other things, give residents the ability to opt out
of having their data sold. Multiple states have introduced similar bills
in the past few months alone. On Capitol Hill, Congress is considering
the contours of a federal data protection law—though progress is, as
always in Washington, slow-going.
If there’s one choice that Facebook has made repeatedly over the
past 15 years, it’s been to prioritize growth over privacy. Users were
consistently encouraged to make more of their information public
than they were comfortable with. The settings to make things public
were always a bit easier to use than the ones to make things private.
Data was collected that you had no idea was being collected, and it was shared in ways you had no idea it was being shared.
Now Mark Zuckerberg, the CEO of Facebook, is 34. He’s a public
figure who is attacked relentlessly in the press and by politicians
around the world. He has two children, a house he blocks from view,
and a cover on his laptop camera. He’s also seen his company get
burned for ignoring user privacy, and he’s seen that the platform he
built to make the world more open and connected can also be used
by harassers, racists, trolls, bullies, and Vladimir Putin. His company’s
reputation has faltered; growth on the main platform has slowed,
and employee morale has dropped. It seems like a good time for a
change.
“Public social networks will continue to be very important in people's
lives—for connecting with everyone you know, discovering new
people, ideas and content, and giving people a voice more broadly,”
Zuckerberg wrote. “But now, with all the ways people also want to
interact privately, there's also an opportunity to build a simpler
platform that's focused on privacy first.”
The company’s loose policies on data collection over the years are
also what allowed it to build one of the most successful advertising
businesses in history. All the data the company collects helps
advertisers segment and target people. And it’s the relentless pursuit
of that data that has led to Facebook being accused of making
inappropriate deals for data with device manufacturers and software
partners. This is a history that Zuckerberg knows well, and one that
he acknowledged in his post. “I understand that many people don’t
think Facebook can or would even want to build this kind of privacy-
focused platform—because frankly we don’t currently have a strong
reputation for building privacy protective services,” he wrote.
The fact that your individual messages might be encrypted in transit does not, in any way, prevent Facebook, the entity, from knowing
who your friends are, where you go, what links you click, what apps
you use, what you buy, what you pay for and where, what businesses
you communicate with, what games you play, and whatever
information you might have given to Facebook or Instagram in the
past.