
Why did 2024 Lok Sabha predictions miss the mark? Here’s the science behind exit polls

Experts say limitations in sampling methodology, a reworking of uniform swing regions and structural issues could be behind the polling industry's mistakes

On June 4, many people took to X (formerly Twitter) to slam the exit poll predictions after counting began for India's 2024 Lok Sabha elections.

Chetan Bhagat, author and columnist, for example, wrote: “Either exit polls lied. Or interviewed voters lied to the exit polls. Or exit polls used flawed methodologies…” Sumanth Raman, sports commentator, said: “Those who did the exit polls must be held accountable. No way they could have missed the trends if they did the poll genuinely.”

Pollsters like India Today-Axis My India, Chanakya and ABP-CVoter, to name a few, predicted that the National Democratic Alliance (NDA) would win 350-400 seats. Other pollsters estimated that the alliance would get over 350 seats.

But when the votes were counted, the alliance had won only 293 seats, according to the newspaper Indian Express, with the Bharatiya Janata Party (BJP) bagging 240. The exit polls had grossly overestimated the NDA's numbers, predicting a landslide victory.

This is not the first time that exit polls have got it wrong. In 2014, exit polls underestimated the NDA's numbers. That year, the alliance won 336 seats, while the United Progressive Alliance (UPA) led by the Indian National Congress secured just 59 seats, with the remaining 149 going to others.

“Almost all polls, including the exit polls, grossly underestimated the strength of NDA and overestimated the strength of the UPA, even though they did predict the victory of NDA,” read a 2021 preprint paper.

In 2019, the story was the same. Only two pollsters — India Today-Axis My India and News 24-Today’s Chanakya — got the numbers right. 


So, why do exit polls go wrong? Down To Earth spoke with experts to understand the history of exit polls in the country and why pollsters probably overestimated the NDA's numbers this time.

What are exit polls? 

Exit polls are surveys conducted immediately after voters leave the polling stations. Pollsters use probability and statistics to forecast election results.

In 1936, Gallup, an American multinational analytics and advisory company, accurately predicted Franklin D Roosevelt’s victory over Alf Landon in the United States presidential election. The company based its prediction on the scientific sampling of a few thousand people.

“While this science has been around for a long time, the polling industry in India took off from the late 80s onwards, after a gap of close to 40 years,” Amogh Dhar Sharma, departmental lecturer at the Oxford Department of International Development, told DTE.

This, according to Sharma, is crucial because, in the 80s, Indian elections became far more unpredictable. This period also witnessed a rise in the regionalisation of Indian politics.

“Though Congress won a historic mandate in 1984, it was reversed in the 1989 elections. I think this is why polls became so popular in Indian politics from the 1980s onwards. The Indian voter suddenly became a bit of a mystery for the political class,” he added.

In the 1980s, journalist Prannoy Roy conducted opinion polls during elections to gauge the mood of Indian voters. Thanks to the proliferation of electronic media in the 1990s, election surveys and exit polls grew popular in India. 

Pre-election surveys and exit polls have become a regular feature over the last decade and a half, political analyst Praveen Rai of the Centre for the Study of Developing Societies, a research institute, wrote in the journal Economic & Political Weekly.

Exit polls can explain voter behaviour and provide early projections of election outcomes. According to Sharma, exit polls can show how people voted, broken down by demographic and socioeconomic group.

However, the importance of exit polls is minimal; they are just for public and media consumption and have no other impact, Rajeeva Karandikar, professor emeritus at the Chennai Mathematical Institute, told DTE.

How are the predictions calculated?

Polling agencies conduct large sample surveys, asking people which party they voted for. Rahul Verma, a fellow at the think tank Centre for Policy Research and visiting assistant professor at Ashoka University, explained that surveys should choose respondents randomly, and that the sample should be large enough to be representative of the population.
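To see why sample size matters, consider the standard margin of error for a simple random sample. The sketch below is illustrative only, not any pollster's actual formula:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95 per cent margin of error for an estimated
    vote share p from a simple random sample of n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

# A 40 per cent share estimated from 1,000 respondents is only good
# to about +/-3 percentage points; 10,000 respondents get it to ~1.
print(round(100 * margin_of_error(0.40, 1_000), 1))   # 3.0
print(round(100 * margin_of_error(0.40, 10_000), 1))  # 1.0
```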

Some people might not want to reveal their vote. To get around this, Karandikar said he would give respondents a slip of paper to mark their vote anonymously and ask them to drop it in a box. Another approach some pollsters take is to ask respondents a series of questions without directly asking whom they voted for; based on the responses, the polling team deduces the party an individual voted for.
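As a toy illustration of the indirect approach, answers to a few proxy questions can be scored against each party, with the highest total taken as the likely vote. The questions, parties and weights below are invented for this sketch; real agencies use far more careful instruments:

```python
# Hypothetical illustration: infer a likely vote from indirect answers.
# The questions, parties and weights are invented for the sketch.
PROXY_WEIGHTS = {
    "approves_incumbent_mp": {"Party A": 2, "Party B": -1},
    "priority_is_local_issues": {"Party A": -1, "Party B": 2},
}

def deduce_party(answers: dict[str, bool]) -> str:
    """Score each party from yes/no proxy answers; highest score wins."""
    scores: dict[str, int] = {}
    for question, said_yes in answers.items():
        if not said_yes:
            continue
        for party, weight in PROXY_WEIGHTS.get(question, {}).items():
            scores[party] = scores.get(party, 0) + weight
    return max(scores, key=scores.get) if scores else "undecided"

print(deduce_party({"approves_incumbent_mp": True,
                    "priority_is_local_issues": False}))  # Party A
```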

Pollsters also perform modelling to predict voter behaviour, and this has to be done state-wise. “We see that each state is quite different from the neighbour. Even among socioeconomically similar districts, people can act differently because of the political history of those states,” he said.
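A minimal sketch, with invented numbers, of what estimating state-wise means in practice: each state's respondents are tallied separately, so genuine regional differences are not averaged away in a single national figure.

```python
# Invented respondent counts per state, by stated party preference.
responses = {
    "State X": {"Party A": 620, "Party B": 380},
    "State Y": {"Party A": 410, "Party B": 590},
}

# Estimating each state separately preserves regional differences that
# one pooled national estimate (51.5% vs 48.5% here) would hide.
for state, counts in responses.items():
    total = sum(counts.values())
    shares = {p: round(100 * c / total, 1) for p, c in counts.items()}
    print(state, shares)
# State X {'Party A': 62.0, 'Party B': 38.0}
# State Y {'Party A': 41.0, 'Party B': 59.0}
```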

What could have gone wrong this time? 

Karandikar speculated that a limitation of the sampling methodology that was followed for the exit polls could be to blame.

“For example, if all 10 pollsters follow reasonable methodology for sampling and analysis, they can be off the mark but unlikely to arrive at similar numbers — the errors are likely to cancel each other out, or at least the average is likely to be closer to the truth,” he noted.
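Karandikar's reasoning can be checked with a quick simulation, under the assumption that ten pollsters' errors are independent and unbiased (the "true" share and the noise level below are invented): the individual estimates scatter, but their average lands near the truth. That nearly all the 2024 exit polls erred in the same direction points instead to a shared, systematic bias.

```python
import random

random.seed(7)
TRUE_SHARE = 42.0  # invented true vote share, in per cent
NOISE_SD = 2.0     # invented per-pollster sampling error

# Ten pollsters with independent, unbiased errors: estimates disagree
# with one another, but their average sits close to the true share.
polls = [random.gauss(TRUE_SHARE, NOISE_SD) for _ in range(10)]
print([round(p, 1) for p in polls])
print("average:", round(sum(polls) / len(polls), 2))
```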

Pollsters may have mixed and matched the data with expert opinion, Karandikar suspected. There is no way to know for sure as pollsters are not required to publish their methodology. 

Sharma said he believed the Centre for the Study of Developing Societies and the private player Axis My India were being fairly honest with their methodology, even though they got it drastically wrong.

However, the pollster industry has become crowded and there is no quality control, he pointed out. “Very few people bother to ask these pollsters to be honest and upfront about the methodology,” he noted.

Verma said pollsters overestimated the BJP's seats in Uttar Pradesh, with Axis My India especially overestimating the party's tally in Maharashtra. Axis and CVoter also overestimated how well the BJP would do in West Bengal.

“With the same methodology, they [Axis and CVoter] have also been getting it right. They also predicted many elections in the last five years with the same methodology and they have also made mistakes, which is part of the process,” Verma explained.

Axis My India, for instance, claims an accuracy rate of roughly 95 per cent across 47 elections, including two general elections.

Pollsters may also have under-sampled women in West Bengal, who may have voted heavily in favour of the Mamata Banerjee-led All India Trinamool Congress, he added.
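One standard repair for an under-sampled group is post-stratification: reweight respondents so the sample's gender mix matches the electorate's. A minimal sketch, with all figures invented:

```python
# Invented example: women are 50 per cent of the electorate but only
# 35 per cent of this sample, and in the toy data favour Party T more.
sample = {
    # group: (respondents in sample, share voting for Party T)
    "women": (350, 0.55),
    "men":   (650, 0.40),
}
population_share = {"women": 0.50, "men": 0.50}

n = sum(count for count, _ in sample.values())
raw = sum(count * share for count, share in sample.values()) / n
weighted = sum(population_share[g] * share
               for g, (_, share) in sample.items())
print(f"raw estimate:      {raw:.2f}")       # ~0.45, biased low
print(f"weighted estimate: {weighted:.2f}")  # ~0.48, nearer the toy truth
```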


Sharma also suspected the 2024 elections saw a reworking of uniform swing regions.

“If the trend in a certain sample of seats in a province is in a certain direction, you can be fairly confident that there will be a uniform swing in that regional cluster,” he highlighted.

Based on their analysis of previous verdicts, Sharma added, psephologists (people who study elections) work with the assumption that certain regions vote as uniform clusters.

“What we are seeing right now is a coming apart and a reshaping of what those clusters might look like. But we need to wait for the data,” he explained.
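A minimal sketch of the uniform-swing assumption, with invented constituencies and margins: take last election's margins in a regional cluster, apply one cluster-wide swing, and count the seats that change hands.

```python
# Invented previous-election margins (Party A share minus Party B
# share, in percentage points) for seats in one regional cluster.
previous_margins = {"Seat 1": 4.0, "Seat 2": -1.5, "Seat 3": 0.8,
                    "Seat 4": -6.0, "Seat 5": 2.2}

def project_seats(margins: dict[str, float], swing: float) -> int:
    """Seats won by Party A if a single uniform swing (in points
    towards Party A) is applied to every seat in the cluster."""
    return sum(1 for m in margins.values() if m + swing > 0)

print(project_seats(previous_margins, 0.0))   # 3: no swing
print(project_seats(previous_margins, -2.0))  # 2: 2-point swing away
# If the swing is not actually uniform within the cluster, projections
# built this way can miss badly, which is Sharma's suspicion for 2024.
```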

Then, there could have been issues with vote-to-seat conversion. To convert vote shares to seats, Karandikar uses the probabilistic count method. 

“To put it very simply, let us say in one constituency, the gap between two leading candidates is 1 per cent. In the neighbouring constituency, the gap is 10 per cent. So we are far more sure about the person leading with a 10 per cent gap winning the seat than someone leading with a 1 per cent gap. This needs to be factored in when predicting the number of seats (for major parties) using the probabilistic count method,” he said.
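The idea can be sketched as follows (Karandikar's actual model is more involved, and the error figure here is an assumption): convert each constituency's estimated lead into a win probability, then sum those probabilities instead of treating every projected lead as a guaranteed seat.

```python
import math

def win_probability(lead_pct: float, sd_pct: float = 3.0) -> float:
    """Probability that the leading party actually wins, treating the
    estimated lead as normally distributed with the assumed error."""
    return 0.5 * (1 + math.erf(lead_pct / (sd_pct * math.sqrt(2))))

# A 10-point lead is near-certain; a 1-point lead is close to a toss-up.
print(round(win_probability(10.0), 2))  # ~1.0
print(round(win_probability(1.0), 2))   # ~0.63

# Expected seats are the sum of win probabilities, not a count of leads.
leads = [10.0, 1.0, 1.0, 1.0]
print(round(sum(win_probability(lead) for lead in leads), 2))  # ~2.89
```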

Sharma also pointed to structural reasons. In Western democracies, such as the United Kingdom, there are silent conservative voters who refuse to declare their support despite voting for conservative parties. In India, people voting for or against the BJP may, for whatever reason, similarly remain silent. Every exit poll should account for this, he said.

Another reason, he speculated, concerns the number of parties: pollsters went wrong in states where the effective number of parties was greater than two, such as Uttar Pradesh, West Bengal and Maharashtra.

Do polling errors occur in other nations too?

The US has seen two particularly notable failures, in the 1948 and 2016 presidential elections.

In 1948, the failure was clearly due to non-random sampling. In 2016, it was mainly due to a very high non-response rate [where voters chose not to respond] and the resulting bias, which persisted despite weighting adjustments.

Several state polls, especially in Michigan, Pennsylvania and Wisconsin, failed to capture the swing towards Donald Trump, the eventual 45th US President, among many white blue-collar workers.
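A small simulation of this 2016-style failure, with invented response rates: if one candidate's supporters talk to pollsters less often, the raw sample is biased, and weighting cannot fully repair it unless the weighting variables capture who is refusing.

```python
import random

random.seed(1)
N = 100_000
TRUE_SUPPORT = 0.50           # invented true support for Candidate C
RESPONSE_IF_SUPPORTER = 0.05  # C's supporters answer pollsters less often
RESPONSE_OTHERWISE = 0.08

responses = []
for _ in range(N):
    supports = random.random() < TRUE_SUPPORT
    rate = RESPONSE_IF_SUPPORTER if supports else RESPONSE_OTHERWISE
    if random.random() < rate:  # only responders enter the poll
        responses.append(supports)

estimate = sum(responses) / len(responses)
print(f"true support:    {TRUE_SUPPORT:.2f}")
print(f"polled estimate: {estimate:.2f}")  # noticeably below 0.50
```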

Pollsters in the UK also got the Brexit referendum wrong. According to the British daily The Guardian, fewer than a third (55) of the 168 polls conducted after the European Union referendum wording was decided in September 2015 predicted a leave vote.

“On average, the exit polls in the UK have been fairly accurate. This could be because of the average size of the constituency in the country, which is shockingly smaller than in India. This is why, I think the job of a pollster in the UK is also, in some ways, easier. One can make accurate predictions on a much smaller sample size,” Sharma noted.



