“NDA leads with 275,” said the cover of Outlook magazine in April 2004. The National Democratic Alliance, or NDA, lost that election, and the magazine declared in its subsequent issue that it would never use an opinion poll to forecast seats ever again. It has stuck to the decision.
It is not coincidental that most opinion and exit polls in the media are commissioned by TV channels rather than print publications. In the race for television rating points (TRPs), TV news needs to keep viewers glued to the screen, and so channels commission opinion polls. Even when survey companies get their opinion and exit polls wrong, the same companies continue to get work. Look, nobody else got it right, the companies say, but we got so many right in the past.
Sometimes, especially when the result is crystal clear to everyone, the survey companies do get it right. The question is, so what? How do opinion and exit polls serve any journalistic or democratic purpose by getting it right? What is the purpose of opinion and exit polls?
The purpose, we are told, is to gauge the political mood of the electorate. That is indeed a worthy journalistic exercise before, during and after an election. I fail to understand why we need opinion and exit polls to understand popular political mood. Political reporters seem to do it far better, every election. Besides, what is the point of exit polls, given that the real results are going to be out in two days anyway?
As Rajdeep Sardesai broadcast Cicero’s exit poll before the Bihar results, he seemed sceptical of the data Cicero was vouching for. Sardesai told Cicero’s Dhananjai Joshi on air that he would have egg on his face if he got it wrong. It is a pity that even the best and most experienced political journalists in India need to spend hours of airtime discussing pollsters’ predictions that they themselves don’t trust.
As it is, the Indian media’s credibility sinks further with every news cycle. By taking opinion and exit polls seriously, news channels are trading credibility for TRPs. They need to collectively decide that they don’t need these surveys to gauge public mood in elections, and to put more faith in their reporters’ ability to do so.
The CSDS flip-flop
Despite its small circulation, The Indian Express is unarguably India’s best and most influential newspaper, one you can’t ignore even if you don’t agree with it on some days. It has an excellent team of political reporters that covered the Bihar election as well as anyone else. Vandita Mishra’s stories gave us a clear hint that the Mahagatbandhan was ahead.
Why, then, did the Express need to publish as its front-page lead story the Lokniti-CSDS pre-poll survey of October 7, which told us the NDA was winning? “Advantage BJP as Bihar gets ready,” the headline read. I was travelling in Bihar at the time and did not agree with its assessment. By evening, I saw that the paper had changed the article’s headline to “Surge for BJP-led NDA alliance in Bihar’s urban areas.” Since only 11 per cent of Bihar’s voters live in urban areas, I wondered if this change of headline meant that the Express was no longer sure of the big-picture prediction by Lokniti-CSDS: that the NDA was four per cent ahead of the Mahagatbandhan.
I took a closer look at the article, authored by Suhas Palshikar and Sanjay Kumar of Lokniti-CSDS, and felt their own numbers weren’t adding up. The swing voters in Bihar were the Extremely Backward Classes (EBCs). The Lokniti-CSDS report said there was a “low” consolidation of EBCs in favour of the NDA. As I understood it, either side needed a high consolidation of EBCs to win, and I felt that that consolidation was happening in favour of the Mahagatbandhan.
The survey said other intriguing things: for instance, that 63 per cent of voters were happy with Nitish Kumar’s performance as chief minister, whereas a considerably lower 54 per cent were happy with the performance as prime minister of Narendra Modi, who wasn’t even seeking to be CM. The survey also found that in what was a bi-polar contest, 39 per cent of Muslims were planning to vote for someone other than the NDA or the Mahagatbandhan.
In an article in Huffington Post the same day, I pointed out these discrepancies, arguing that the details of the survey actually showed the Mahagatbandhan at an advantage. Within a few hours, Sanjay Kumar responded in The Indian Express, saying he stood by his survey. The original article’s headline was back to “Advantage NDA”.
On October 12, Sanjay Kumar published an article in Firstpost saying the Bihar elections were too close to call, and that we shouldn’t blame the polls if they got it wrong! He wrote, “But this election seems to be tough even for a common man. Ask anyone in Bihar a simple question: Is baar chunaav me kiske hawwa hai? (Which way is the wind blowing in this election?) The most likely reply is: Kuch kah nahi sakte, takkar hai (can’t say for sure, it is a close fight).”
Clearly, Mr Kumar was distancing himself from his own survey even before the first vote was cast.
On November 7, Lokniti published its post-poll survey results in The Indian Express. A post-poll survey is one conducted a day after the voting, whereas an exit poll is conducted by surveying voters as they come out of the polling booth. A post-poll survey is, by design, less likely to get it wrong. The Lokniti post-poll survey said the Mahagatbandhan had taken a lead of four per cent over the NDA during the campaign. The key words here are “during the campaign,” carefully chosen to suggest that the Lokniti pre-poll survey findings weren’t incorrect, but that things had changed a lot in the month-long campaign.
If the shift happened during the campaign, why did Sanjay Kumar distance himself from his own survey even before the first phase?
The Express story on the post-poll survey did not have Sanjay Kumar’s byline, but that of Team Lokniti instead.
On November 8, it turned out that the Mahagatbandhan’s lead over the NDA was 7.8 per cent, nearly double what the Lokniti post-poll survey had predicted.
Lokniti-CSDS is India’s most reputed election survey organisation, not least because it is not a commercial organisation like the others. If this is the state of our best election surveyors, it is clear that surveys are an unscientific way of studying Indian elections.
We don’t need surveys
1) They are unscientific: Defending election surveys after the Bihar results, Yogendra Yadav, founder of Lokniti-CSDS but now a politician, wrote in Mint that we need “scientific” surveys because “anecdotal field reporting” cannot substitute for them. Yet election after election, we have seen “anecdotal field reporting” and “casual conversation” give us a better understanding of elections. One reporter may talk to only a few dozen voters, but multiply that by hundreds of reporters and you have a very large sample. It is clear that it is these surveys that are unscientific, not the anecdotal reporting or the casual conversations.
There are many reasons why surveys don’t get it right: voters are reluctant to speak up, and sometimes deliberately lie; the surveyors are poorly paid unemployed youth who cut corners; and there are allegations of political bias affecting media surveys. Whatever the reasons, it is clear that as a methodology, surveys are failing us. The conversion of vote shares gauged from surveys into seat forecasts, in particular, is about as scientific as homeopathy.
2) They don’t help us gauge public mood: Yadav further wrote in the Mint article that the main purpose of election surveys was not seat forecasting but gauging public mood “between two elections”. Again, “anecdotal” field reporting and “casual conversations” do this so well that we don’t really need faulty surveys. Given that surveys are unable to correctly predict election results even as exit and post-poll surveys, let alone opinion polls, how can we consider their other findings about political mood reliable?
We must also not forget that the ultimate reflection of public mood between two elections is the election itself. Opinion and exit polls are presented on TV as if they were real numbers, but the real numbers are the ones the Election Commission declares, and those serve the purpose of understanding shifts in voter sentiment very well.
3) Media doesn’t spend enough to conduct good surveys: Yadav writes, “Most politicians who condemn polls on television privately commission these very agencies to carry out polls for them.” I know a pollster who does polls for a political party, and I asked him why he doesn’t do polls for the media. He told me that TV channels are not willing to pay what it really costs to do a good survey. He also said that, unlike most media polls, he surveys every constituency, because parties need feedback on each one. Besides, a political party won’t again hire a survey company that gets it wrong. Such credibility issues don’t seem to affect those who do surveys for the media, because the media doesn’t seem to care about credibility.
4) They possibly hurt democracy: Yadav writes, “They may not acknowledge it, but politicians and journalists have all learnt to adjust their ‘gut sense’ in tune with the findings of the polls.” This is true, but if even politicians and journalists are affected by surveys, what about the common voter? The common voter is affected too, particularly floating voters who go with the hawa and decide their vote just before polling. In other words, these faulty surveys are affecting voter behaviour.
When the Election Commission banned the publication or broadcast of exit poll results between phases, the media made some noises but didn’t go to the Supreme Court to defend free speech. That is perhaps because the media also agrees that these polls can affect voter behaviour.
Privately, some pollsters admit this could be the case. Politicians and voters in Bihar both agreed when I asked them if election surveys were affecting voter behaviour. Perhaps there needs to be a survey to find out if election surveys are not merely gauging but also affecting public mood, thereby hurting our democracy.
In the United States, where surveys are increasingly making incorrect forecasts simply because people refuse to participate in them, there are growing voices arguing that surveys are ruining democracy.
5) They never get it right: Yadav writes that getting the seats right is difficult in the first-past-the-post system, but that the surveys usually pick the right winner. In that case, why do they venture to predict seats at all? And if it is only about picking the right winner, then anecdotal field reporting and casual conversations do that very well, too.
In September, Sanjay Kumar and Pranav Gupta wrote in The Hindu that surveys usually get it right, with a table showing that surveys mostly pick the right winner. It is only when you read the article very closely that you realise they are defending exit polls, not opinion polls.
The truth is, they never get the seat forecast right. Praveen Chakravarthy, writing in The Hindu, analysed 82 election surveys across 13 Lok Sabha and state elections since 1996. Applying the +/- 5 per cent rule, he found that not a single survey came even that close. None.
6) They are prone to political manipulation: According to former chief election commissioner SY Quraishi, “All parties have been seeking a ban on them since pollsters came to them promising to fudge. They were ready to manipulate margin of error, increase the number of seats, manipulate the sample size, facilitate publishing of opinion polls and give two reports — one honest and the other fudged.”
When the Election Commission asked political parties if they were in favour of banning opinion polls before elections, all but one agreed. The only party that supports them is the Bharatiya Janata Party.
Having spent a lot of time in Bihar through the election campaign, I found to my surprise that top politicians on both sides of the divide knew in advance what opinion polls were going to say. Between phases, they even seemed to know what the exit polls had found.
Opinion and exit polls should not be banned; pollsters have the right to conduct their unscientific surveys. It is the media that needs to ask itself: is there more to be gained or lost by sponsoring and publishing them?
Update: The story had erroneously identified the channel that conducted the sting on survey companies as Cobrapost. The channel that had conducted the sting is News Express. The error has been corrected.