Spot On: Journalists Struggle To Get Political Opinion Polls Right
Three weeks later, when the Election Commission announced the results,
the bedrock of the NDA, Vajpayee's Bharatiya Janata Party, had lost about
1.6 percentage points from the vote share it had garnered in the last Lok
Sabha elections, in 1999, sliding from 23.75 percent to 22.16 percent.
The Congress lost 1.8 percentage points, dropping to 26.53 percent of
votes nationwide. But despite similar declines in vote share, the two
parties achieved contradictory results. While the BJP lost 44 seats, its
main challenger gained 31. In the end, the NDA did not even touch 200,
falling 15 short. The Congress's alliance marshalled 217, and eventually
secured enough external support to form a new government.
It wasn't the first time that political polls had been inaccurate in India, but calling the 2004 polls merely inaccurate is like calling a tsunami a ripple. Not only did every major pollster and media outlet get the numbers wrong; they miserably failed to predict the overall trend. Worse, they had
oversold their polls in a way that now seemed disingenuous. India Today,
which had conducted 155 opinion polls since 1978, would have known
that even the most rigorous survey can get the final outcome wrong. But
the magazine, and others in the media, presented their various polls as
definitive forecasts.
Some members of the press, humiliated by the discrepancy between their
headlines and the results, laid the blame on pollsters. "I don't see any reason why this magazine should carry the can for the incompetence and ineptitude of desi and foreign pollwallahs who use the media in order to pontificate on their brilliant scientific models," an angry Vinod Mehta, then the editor-in-chief of Outlook, wrote. "Some readers are convinced Outlook manipulates its polls. I would like to inform them that I only pay huge sums of money to pollsters so that they can embarrass me with hopelessly inaccurate predictions."
TODAY, despite the embarrassment of years like 2004, there is little to
suggest that the media has become more cautious in selling opinion polls.
Survey after survey is proclaimed to be the definitive exercise in election
forecasting. Every seat, every state, every alliance and, finally, every
election, is the subject of self-assured prophesying.
Perhaps as a corollary, cynicism about opinion polling seems fairly
widespread. Since 1997, the Election Commission has contemplated
banning opinion polls, claiming, without clear evidence, that voters will be
unfairly swayed by survey results. In a letter written to the Election
Commission on 30 October this year, the Congress supported the proposal
to prohibit the dissemination of surveys once elections are announced.
The party said polls lacked credibility and had the potential to be
manipulated.
But the scepticism isn't limited to political parties, which are generally
eager to protect the morale of their cadres from negative survey results.
In the last two general elections, the majority of opinion polls have been off the mark when it comes to the number of seats the two major parties would win, and this seems to have fostered public distrust of pollsters and
their methods. If the final number is wrong, many people tend to assume
a poll was either manipulated or based on flawed science.
In reality, however, most of the opinion polls published and broadcast by
national media houses are fairly robust exercises led by social scientists
with decades of experience. Although they sometimes get the final tally of
seats wrong, this isn't necessarily the result of poor polling methods or bias. Rather, it's a reflection of the inherent difficulty of distilling into a handful of numbers the vastness and diversity of the Indian electorate and the fickleness of Indian politics, a challenge of which pollsters and editors are acutely aware.
But opinion polls are nevertheless presented as a sort of political gospel, and the media often gives viewers and readers only the final outcome of a survey: seat projections. Yet these projections are subject to greater uncertainty than any other product of the entire polling exercise. Behind the numbers are two complex, expensive processes. The first, comparatively easy but by no means simple, involves holding structured conversations with a large number of people to understand how they are likely to vote and to determine a party or candidate's probable vote share.
The second is to create seat projections by subjecting vote share to
various mathematical models and to adjustments based on historical
precedents and a dizzying array of political factors.
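As a toy illustration of that second step, one of the simplest vote-to-seat conversions is a uniform-swing model: shift every constituency's previous vote shares by the polled national change and re-count the winners. The constituency figures and party names below are invented for illustration; real pollsters layer far more elaborate, region-specific adjustments on top of anything this simple.

```python
# Toy uniform-swing seat projection (illustrative only; the constituency
# data and parties "A", "B", "C" are invented for this sketch).
# Each constituency: previous-election vote shares (percent) by party.
previous = [
    {"A": 42.0, "B": 38.0, "C": 20.0},
    {"A": 35.0, "B": 40.0, "C": 25.0},
    {"A": 30.0, "B": 32.0, "C": 38.0},
]

# Polled national change in vote share since the last election,
# in percentage points.
swing = {"A": -2.0, "B": 3.0, "C": -1.0}

def project_seats(constituencies, swing):
    """Apply the same national swing to every constituency and
    count which party finishes first in each."""
    seats = {party: 0 for party in swing}
    for shares in constituencies:
        adjusted = {p: shares[p] + swing[p] for p in shares}
        winner = max(adjusted, key=adjusted.get)
        seats[winner] += 1
    return seats

print(project_seats(previous, swing))  # -> {'A': 0, 'B': 2, 'C': 1}
```

Even in this sketch, a two-point national shift flips individual seats, which hints at why small errors in estimated vote share can translate into large errors in seat projections.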
Rarely does the complexity of these processes get conveyed by
journalists, many of whom feel the public are only interested in seat
projections. "If we don't give seat number projections based on vote shares, the viewer feels cheated," Rajdeep Sardesai, the editor-in-chief of CNN-IBN, told me. "Then the viewer says tum cop out kar rahe ho (You are copping out). You are not telling me who will get how many seats. I am not interested in the vote share. So after having spent so much money, if you give only vote share then viewers and readers are upset." Unfortunately, Sardesai added, most viewers and readers expect it to be arithmetic, saying, "why didn't you get this number right?"
Most of the opinion polls available for public consumption are solely
commissioned by the media. But there is little consensus on why they do
it. Almost all the editors I met while reporting this story believe their
audiences are interested in the surveys, but there's no way to measure
that. Sardesai and other television journalists said it does not make much
of a difference to their viewership numbers. Newspaper and magazine
editors told me that in India, publications are largely subscription-driven
and readers seldom buy papers off the stands, so its impossible to detect
short-term fluctuations in readership. But Sardesai claimed that some of
the data can be used to improve reporting on topics that survey
respondents care about, and that, he thinks, is extremely valuable for any
editor.
Some journalists I spoke with called polls a publicity stunt, even though it's not clear that surveys contribute to a media organisation's bottom line. Others said that polls add value to their understanding of socio-political trends and of the internal politics of the winning party or coalition. Not for nothing has India been called a pollster's nightmare.
In his later years, Ahluwalia told me, da Costa was "really a very impressive old gentleman, very refined, very erudite. But quite opinionated, I thought." For the fourth Lok Sabha elections in 1967, da Costa "did several polls that I think were front-paged in the Indian Express," Ahluwalia said. According to an article by the historian Ramachandra Guha, published in The Hindu, a 1967 report by da Costa forecast the disintegration of the monolithic exercise of power by the Congress party.
Although the Congress ultimately lost power in many states, Indira was able to maintain control at the centre for the next ten years. In many people's eyes, da Costa got the most important part of his assessment, the overall picture, wrong. (In fact, however, da Costa's numbers appear to have been fairly accurate.) "So I think pollsters went into hiding for a while," Ahluwalia said, laughing. "That got opinion polling off to a very bad start."
The resurgence in Indian political polling began in 1979, when Roy and
Lahiri, then young professors at the Delhi School of Economics, created an
election forecasting tool called the Index of Opposition Unity. At the time,
Congress (Indira) was the single largest party and, even though it wasn't
in power, Roy and Lahiri theorised that the likelihood of a successful
challenge to the party depended upon the solidarity of its opponents. Roy
approached Purie, whom he knew from their days at the Doon School, to
see if he wanted to publish the results of their predictions.
Before starting India Today, Purie had studied with Sopariwala at the
London School of Economics. Sopariwala was now working with the Indian
Market Research Bureau under the leadership of Ahluwalia, who was only
in his late twenties but already heading one of the largest private market
research firms in the country. Ahluwalia and Sopariwala had previously
done some commercial polls for India Today and some political polls for
other publications. Purie brought them together with Roy and Lahiri. "And we got it right, several times," Ahluwalia said as we chatted in his apartment overlooking the Arabian Sea in south Mumbai. "Including, most famously, the Rajiv Gandhi victory."
In 1980, the India Today team predicted a solid majority for the Congress
(Indira), which ended up getting 353 seats. In 1984, after Indira's
assassination, they officially gave Rajiv Gandhi up to 400 seats, but said
that these results might be understated and he could actually go on to
win even more. He won 415.
Five years later, they hit the bullseye with an exit poll accurately
predicting the Congress would win 193 seats. Purie threw a party at his
house to celebrate the poll just as the final results were being announced,
Sopariwala told me over email. The results for Uttar Pradesh's Pratapgarh
constituency came in around 10 pm, pushing the final tally to exactly 193.
"…our ambition in the long run, Sagarika, is to make polls something which is not discussed too often. Doctors don't discuss thermometers."
Ghose tried to summarise: "It is to catch the trend."
Earlier this year, I met Yadav, one of India's foremost psephologists, in his office at the CSDS campus in north Delhi. Yadav founded the Lokniti Project at CSDS in 1997 to collect data on voters' opinions before and after every major state and national election. (He stepped down as the head of the project last year, when he joined the Aam Aadmi Party. Though he is still a part of the CSDS team, he does not participate in producing opinion polls or seat forecasts now.) The project's goal is partly to monitor the working of Indian democracy.
"My sense is there is unnecessary mystique and a feeling of black magic about polls in our country," Yadav said to me. "Largely because these are new things and people are not familiar with this stuff. Essentially, opinion poll is nothing but a very systematic way of holding conversations with a very large number of people. And the findings are presented not in the form of quotations from those conversations but in the form of numbers, because you can't report 5,000 conversations. In that sense they are not in principle different from news reporting."
"The trouble is that in our country opinion polls have been reduced to election-related polls, election-related polls have been reduced to election forecasting," he added. He spoke softly, confidently, calculating each word. "Now that is a very limiting way of looking at polls." The poll CSDS did for CNN-IBN in 2012 estimated that 34 percent of votes in Uttar Pradesh would go to the Samajwadi Party, and 24 percent to Mayawati's Bahujan Samaj Party. This 10 percentage point lead was three times the actual difference, well beyond the survey's stated margin of error, and it seemed to bother Yadav. But Ghose was happy that her channel had picked the winning side.
Part of the problem, pollsters feel, is that journalists don't understand the usefulness of the data that underlie seat projections. "They are uncomfortable with numbers," said Yashvant Deshmukh, the head of CVoter, one of the largest private socio-political polling agencies in the country. He told me journalists don't look behind the final numbers to see what went into the result. "That is why media is coming up always with so superficial, over-simplistic analysis, virtually pedestrian analysis, for every mandate," he told me.
Since 1993, when Deshmukh founded CVoter after graduating in
journalism from the Indian Institute of Mass Communication, he has
conducted polls for almost all major English- and Hindi-language national
news channels; some regional ones; magazines including The Week and
India Today; the Hindustan Times and Indian Express; and various other
institutions, according to his companys website.
"Over the period of all these years the media was not educated about the…"
That Deshmukh takes the blame and Goswami's channel gets the credit
holds true for more than just the final forecasts. Opinion polls are often
criticised for a lack of information on how they were conducted. In June,
for example, the economist Vivek Dehejia criticised polls in India generally
(and a specific poll in particular) for their poor reporting practices. "The GFK poll tells us only that interviews were conducted in respondents' homes and in street corners but gives us no indication that subjects were picked randomly," Dehejia wrote in Business Standard. "Also, as is typical with Indian polls, we are not told the margin of error, so have absolutely no way to assess the accuracy of the predictions."
Dehejia seemed to suggest that the lack of transparency in Indian polling
was the fault of pollsters. But methodology notes such as the one aired by Times Now, although they would have failed standards established by various international polling bodies, satisfy the limited guidelines published by the Press Council of India (PCI). The PCI recommends that a newspaper publishing a survey should indicate which institutions carried out the survey, who commissioned it, the size and nature of the selected sample, the method of selection of the sample, and the possible margin of error in the findings. Even though the PCI's jurisdiction is restricted to the print industry, it is the only institution in the country that produces any sort of recommendations for publishing polls. Most media houses adhere to them as indifferently as the PCI seems to lay them down.
Yogendra Yadav, who has been conducting polls and surveys since 1996,
said he wasn't aware of these recommendations: "As people who carry it out, at least we have never received any guidelines from anyone." He agreed that there is a need to make opinion polls more transparent. "I have personally been fighting for it," he said. "I have written about it. I have written to all kinds of people who matter to say please make it mandatory for every opinion poll to disclose their methodology in great length. I should exactly know the methodology followed for the survey. I should exactly know the method for vote-to-seat conversion. I should know who paid for this survey, who was the customer, who commissioned it, who paid money for the survey. And what is the track record of the agency who is doing it? Is there any conflict of interest? These are absolutely standard procedures."
Accusing just the pollster of opacity in an environment where there is no code of conduct or monitoring body is unfair. It gives the gatekeepers of information, the news channels and publications, a guilt-free pass. Some outlets, such as CNN-IBN, which works exclusively with CSDS, do an admirable job of ensuring the transparency of their surveys, putting thorough methodological notes on their websites. But others fall far short of this standard. If a poll is like reporting, only multiplied, it's the editor's job to make sure his correspondent isn't relying on shoddy research, personal bias or an outright plant. Like a bad story, if a bad poll is published, the editor and the publication have to share the blame.
DORAB SOPARIWALA IS IMMACULATE AND POLITE. He is also the only person I have ever met who talks about god as "the woman up there." His former colleague Titoo Ahluwalia called him, along with…
…sample more people than they need for a robust poll. The data are then collated, and the vote share of each party is derived by tallying up the individual responses.
All this makes polling a very expensive exercise. The price differs from
agency to agency and according to what is expected of the pollster. In the
last general elections, CSDS, which is a non-profit organisation, polled
roughly 30,000 people around the country. The total cost was around Rs 1
crore, Kumar said. About 20 percent of that was covered by CSDS's media
partners; the rest came in the form of grants from academic and research
institutions and government bodies.
Few for-profit agencies, if any, are able to make money from political
opinion polls for the Indian media. CVoter makes most of its money
outside of India, doing polls for foreign governments and international
organisations, according to Deshmukh. Market research firms, such as
ORG (formerly ORG-MARG), use the publicity generated by these surveys
to market themselves to corporate clients interested in understanding
consumer behaviour, which is how they make most of their income.
Sopariwala said that election forecasting is less than one percent of their business, but ninety percent of their public face.
Many media houses share their costs and data with another partner. CNN-IBN had The Week as a print partner for the assembly election polls last year. This year, their election tracker was shared by The Hindu in July and The Week in October. CVoter is in a tri-party agreement with Times Now and India TV, and is also doing polls for the India Today group. Deshmukh did not tell me how much it costs him to do his polls, but it is presumably cheaper than CSDS; CVoter relies mostly on telephone interviews, while CSDS interviews its respondents only face-to-face.
TO GET FROM THE RESULTS OF THE POLL to the seat numbers is a
mind-bogglingly complex process. Sopariwala and Ahluwalia think that the
biggest challenge is identifying those respondents who give interviews but
do not vote, thus distorting the analysis of the data. But this is only one of
a nearly endless list of factors that can go into seat projections.
Various indices and formulas have been developed over the years by Indian psephologists. Prannoy Roy and Ashok Lahiri started with their Index of Opposition Unity. Then, building on Roy and Lahiri's work, came the economist Surjit Bhalla's Lying Index, which tried to offset the effects of electors who don't vote. The process of fine-tuning these techniques is ongoing.
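The Index of Opposition Unity is commonly described as the vote share of the largest opposition party expressed as a percentage of the combined opposition vote, the intuition being that a consolidated opposition hurts the incumbent disproportionately. The sketch below uses that commonly cited reading with invented figures; it may simplify Roy and Lahiri's actual construction.

```python
def opposition_unity_index(opposition_shares):
    """IOU, as commonly described: the largest opposition party's
    vote share as a percentage of the combined opposition vote.
    A value near 100 means the opposition vote is concentrated
    in one party rather than split."""
    total = sum(opposition_shares)
    return 100.0 * max(opposition_shares) / total

# Invented example: three opposition parties polling 30%, 10% and 5%.
print(round(opposition_unity_index([30.0, 10.0, 5.0]), 1))  # -> 66.7

# If the same 45% of opposition votes were united behind one party,
# the index would be 100 and the incumbent far more vulnerable.
print(round(opposition_unity_index([45.0]), 1))  # -> 100.0
```

The point of the index is that, under first-past-the-post, the same total opposition vote can produce very different seat outcomes depending on how unified it is.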
Pollsters are continually trying to account for the ways that various layers
of political representationthe local MLA, the states incumbent party, the
member of parliament, the national incumbent party, the governing
alliance, the executive, and the prime ministeraffect voter preferences.
An individual may be unhappy with a politician or partys performance at
any one of these levels, and still vote for the same party at other levels.
"…have done it, but I can't release it legally, but let me tell you what the survey is. It creates, absolutely, a culture of duplicity, rumours and so on."
He mentioned Battleground 2009: "Prannoy Roy, Dorab Sopariwala and Shekhar Gupta sat and said, 'According to my intuition Congress will get 23 seats in Andhra Pradesh.' Of course they had done a poll. Everyone knew they had done a poll."
The ramifications of these bans, Yadav suggested, could go well beyond
surveys. "The simple fact is you cannot say Prannoy Roy cannot have intuition. Tomorrow I can have intuition. The question is, will we impose this ban on people who write on the edit page? Will we impose this ban on reporters, that they cannot say someone is ahead, someone is behind? If we don't ban all that, how can we possibly ban this?"
EDITORS ACROSS TELEVISION AND PRINT MEDIA told me that while
some "bogus", "rogue" or "manipulated" polls find space in regional and local news outlets, it is rare to find them on national broadcasts, or in national publications. "There are malpractices in the Indian industry," Yadav agreed. "But if you look at some of the more professional polls, I would say they compare with some of the best in the world. Nowhere in the world are election forecasts 100 percent correct."
Still, there are valid criticisms to be made even of polls publicised in the
national media. Foremost among these is that pollsters' methods and the calculus that produces seat projections remain outside public scrutiny. Almost every other suspicion about the technicalities of polling, about the size and representativeness of samples, margins of error, statistical confidence, unresponsive interviewees, and adjustments based on historical precedents, could be dispelled if pollsters and their media partners were absolutely transparent about how they conduct their polls. And this would enhance the credibility of the process. "It should be mandatory," Yadav said, for pollsters to disclose their methodology at great length. "Now with web there is no space problem. You can ask them to put it up on the web."
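For context on the margin of error that the PCI asks publishers to report: for a simple random sample, the conventional 95-percent margin on an estimated vote share p from n respondents is roughly 1.96 × sqrt(p(1−p)/n). A minimal sketch, using the roughly 30,000-person CSDS sample mentioned above; note that real polls use clustered, stratified samples, whose effective margins are wider than this textbook figure.

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error, in percentage points, for an
    estimated proportion p from a simple random sample of size n.
    Assumes simple random sampling; clustered designs need a wider
    margin (a 'design effect' correction)."""
    return 100.0 * z * math.sqrt(p * (1.0 - p) / n)

# A 34% vote-share estimate from a sample of 30,000 respondents:
print(round(margin_of_error(0.34, 30000), 2))  # -> 0.54
```

Even a sub-one-point margin on vote share offers no such guarantee for seat projections, since first-past-the-post can amplify a small vote-share error into a large seat-count error.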
Other sceptics question the motivations and political bias of the pollsters.
Recently, on blogs, on Twitter and in an article in Open magazine, CVoter has been called out for gravely overestimating the BJP's prospects in many forecasts since 2004. Deshmukh, the founder, is a nephew of Nana Saheb Deshmukh, one of the Rashtriya Swayamsevak Sangh's most revered ideologues. For their part, Yadav and Sopariwala have faced accusations of being sympathetic towards the Congress. But none of this proves bias.
Here, too, transparency with regard to methods, and clear disclosures
about potential conflicts of interest, would help shed light on the validity
of such claims, and prevent bias from entering polls in the future.
In the end, however, it may be that people only believe the polls that
reinforce their beliefs. Part of this is human psychology, and part of this
may be that whatever lack of trust exists between the public and the