
Since the beginning of Russia's full-scale invasion of Ukraine, opinion polls have shown a striking 80% support for Vladimir Putin's actions. However, experts agree that the surveys do not reflect the real opinion of the people. In reality, most Russians are reluctant to speak to sociologists at all, and this makes the surveys unrepresentative of popular opinion. Sociologists also argue that in a political regime such as Russia’s, public opinion has no meaning, since a significant portion of the population makes no effort to form an opinion on political issues they believe they have no influence over. Moreover, these surveys can be actively detrimental, as they help the Kremlin persuade the populace that the government enjoys widespread support.

Content
  • How opinion polls are really conducted

  • No one wished to answer

  • Wartime surveys

  • “I’ve got piglets to feed”

  • Polling in a concentration camp


How opinion polls are really conducted

According to the Levada Center, once a reliable and independent sociological institution, support for the war in Ukraine has risen from 68 to 75 percent, and approval of Vladimir Putin's actions has increased from 71 to 83 percent, with only 3 percent of respondents abstaining. The methodology behind these surveys, however, remains unclear: neither the number of participants nor the refusal rate is disclosed. Many of the surveys claim to have relied on door-to-door and personal interviews.

You need to know Russia to doubt these results right away – Russians are generally hesitant to open their doors to strangers, and even less likely to criticize the government to someone they don't know and who knows where they live. Contemporary opinion polling is typically conducted by one of four methods: by phone, in person on the street or at the door, or via the internet. The last is the most convenient method, but also the hardest to control.

Door-to-door polling is considered the least anonymous, and responsible sociologists avoid it in surveys pertaining to matters of war and politics. Grigory Yudin, a Russian sociologist in exile, offers an explanation:

“In face-to-face surveys conducted on-site, various types of residences pose challenges in terms of accessibility. Factors such as fences, intercoms, or guards can hinder interviewers from entering certain premises, making it difficult to include those residents in the survey. However, for reasons unknown to me, the Levada Center persists in using route sampling. On-site polls, compared to telephone surveys, not only carry the risk of potential bias but also present greater difficulties in control. This is because interviewers, unfortunately, sometimes take it upon themselves to answer the questionnaires on behalf of the respondents, assuming they know what the responses would have been. In contrast, in telephone surveys conducted in call centers, where all conversations are recorded, such actions are much more challenging to carry out.”

Russians’ attitude to the war is measured by two types of pollsters: the state-controlled VTsIOM and FOM along with the Levada Center, or the two independent sociological projects, Russian Field and Chronicle. Both projects rely entirely on telephone surveys. Sociologist Daria Pavlova from Russian Field sheds light on how they ensure representativeness:

“We start with sample modeling, where an approximate understanding of the population composition in terms of age and other demographics is established for each region. For example, if the goal is to survey 100 people, the quotas are set accordingly, with a predetermined percentage allocated to different age groups, such as 15% for young people and 30% for retired people, and so on. These quotas are then translated into parameters within survey apps, where researchers can specify the required age and gender of participants. Once the required number of respondents for each category is reached, the survey is closed for entry. When a person agrees to participate, the operator inquires about their age, and the gender is determined based on the voice. If the person fits within the predetermined quota, they are included in the survey. Towards the end of the survey, additional socio-demographic parameters, such as education level, income, and some others, are collected because those provided by Rosstat (Russia's Federal State Statistics Service) are not reliable, and there is also insufficient data on education levels, which poses an additional challenge in this regard.”
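To make the quota mechanism Pavlova describes more concrete, here is a minimal sketch in Python of how a survey tool might screen incoming respondents against age-and-gender quotas for one region. The category labels, quota shares, and class names are illustrative assumptions, not Russian Field's actual configuration.

```python
from dataclasses import dataclass, field

# Hypothetical quotas for one region: how many respondents each
# gender/age cell should contribute to a 100-person sample.
QUOTAS = {
    ("female", "18-29"): 8,  ("male", "18-29"): 7,
    ("female", "30-59"): 28, ("male", "30-59"): 27,
    ("female", "60+"): 17,   ("male", "60+"): 13,
}

def age_bucket(age: int) -> str:
    if age < 30:
        return "18-29"
    return "30-59" if age < 60 else "60+"

@dataclass
class QuotaScreen:
    quotas: dict
    filled: dict = field(default_factory=dict)

    def admit(self, gender: str, age: int) -> bool:
        """Accept the respondent only if their gender/age cell is still open."""
        cell = (gender, age_bucket(age))
        if self.filled.get(cell, 0) >= self.quotas.get(cell, 0):
            return False  # cell already full: the survey is closed for entry
        self.filled[cell] = self.filled.get(cell, 0) + 1
        return True

screen = QuotaScreen(QUOTAS)
print(screen.admit("female", 24))  # True while the 18-29 female cell is open
```

Once every cell reaches its cap, fieldwork for the region stops; the additional socio-demographic questions Pavlova mentions (education, income) are collected at the end of the interview rather than used for screening.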

Unlike the Levada Center, Russian Field and Chronicle also track and report the proportion of people who agree to be interviewed. The big pollsters deliberately conceal this number, most likely because the share of Russians willing to answer a survey is exceptionally low.

No one wished to answer

The response rate (RR) in Russia, as in many other countries, has declined over the years. Asked whether surveys with a low RR can be trusted, sociologist Alexei Levinson of the Levada Center points out that the RR in Western countries is also low, yet polls there are trusted.

But it is not the RR itself that matters; it is how a falling RR affects nonresponse bias. In the United States, for instance, the RR in telephone surveys has dropped from approximately 40% in the 1990s to 6% today, causing concern among sociologists. Researchers have concluded, however, that even with such a low RR, surveys can remain representative as long as an individual's decision to participate is unrelated to their answers. In the United States, the decline in telephone-survey response is largely attributable to the prevalence of spam disguised as surveys, and people across all demographic categories dislike such spam equally. So if one wants to gauge American opinion on presidential candidates, a low RR does not significantly widen the margin of error. In Russia, the RR in telephone surveys is comparable to that of the United States (around 5-7%), but the reasons for non-response are often directly related to politics.
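The difference between a low response rate and nonresponse bias can be seen in a toy simulation. This is a hedged sketch with invented numbers (population size, true support level, per-group response rates), not survey data: when willingness to answer is independent of opinion, even a 6% RR recovers the true share; when critics respond less often, the same average RR inflates measured support.

```python
import random

random.seed(0)

def measured_support(pop_size, true_support, rr_supporters, rr_critics):
    """Share of war supporters among the people who actually take the call."""
    answered = []
    for _ in range(pop_size):
        supports = random.random() < true_support
        rr = rr_supporters if supports else rr_critics
        if random.random() < rr:          # does this person agree to respond?
            answered.append(supports)
    return sum(answered) / len(answered)

# Participation unrelated to opinion: low RR, but the estimate stays unbiased.
print(measured_support(1_000_000, 0.50, rr_supporters=0.06, rr_critics=0.06))  # ~0.50

# Critics three times less likely to respond: same true support and same 6%
# average RR, but measured "support" climbs to roughly 0.75.
print(measured_support(1_000_000, 0.50, rr_supporters=0.09, rr_critics=0.03))  # ~0.75
```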

In Russia, approximately 94 out of 100 people do not participate in telephone surveys conducted by sociologists

Daria Pavlova offers insights into respondents' motivations, providing a few examples:

“Most refusals end in silence – people simply hang up without explanation. But some do give their reasons. We often hear: 'Why are you calling me? I'm neither a politician nor an expert; there is no point in asking me.' The second most common reason is a lack of time and interest: 'I'm busy and not interested in talking.' The third: 'I'm afraid of being scammed – you'll ask for my passport details, you already know my phone number, so you'll be able to steal my money.' Lastly, the fourth category of refusals stems from fears that the call is coming from Ukraine. At some point, members of pro-Russian online forums began advising people to test callers with the question, 'Whose is Crimea?' If the operator fails to answer, the call is assumed to originate from Ukraine, and the person refuses to continue.
Operators complain about a growing number of refusals. They are bound by the guidelines and cannot deviate from them or answer questions beyond those about the survey's methodology. When confronted with such queries, operators explain their role and say that they cannot respond. Some people retort, 'So you're calling from the Armed Forces of Ukraine? Goodbye!' A non-negligible number of people believe the calls come from provocateurs. Others withdraw over privacy concerns, fearing the conversation could end in an encounter with FSB agents.”

Yudin emphasizes that a lack of trust, especially towards the government, is a key reason for refusals. In Russia, there is a deep-seated belief that pollsters are linked to the government:

“In Russia, surveys are viewed as a form of interaction with the government, as a tool of oversight. In America it is different: we understand that surveys are conducted by various institutions with different political affiliations. Some lean towards the Democrats, some towards the Republicans, and some are purely academic. But no one assumes that the government is behind them. In Russia, the majority of respondents firmly believe otherwise, and many see the survey as an occasion to complain about the 'bad boyars' to the 'good tsar'. It is common to hear from respondents: 'Tell Putin that...'. It is futile to explain that you have no ties to Putin and that you represent an independent organization, the Levada Center. They throw back, 'Who are you trying to fool? Putin was on TV yesterday proposing a survey, and today you're telling me you have nothing to do with him.' The Levada Center cannot be held responsible for this, but it does not change the situation.”

The Levada Center vehemently disputes these arguments. In 2022, the organization conducted a dedicated study of people who refuse to participate in surveys, and came to the stunning conclusion that there is no fundamental difference between responders and non-responders. The sociologists focused on individuals who had answered their surveys the previous year, contacted them again for a new survey, and compared those who agreed with those who declined. In other words, the study mostly examined people who normally do talk to sociologists but happened to ignore their inquiries that particular year. The response rate in this group was 30% – and these were people already in Levada's database, the kind who typically agree to such conversations. If even they responded at only 30%, a genuinely random sample would yield a far lower figure, closer to the 5-7% observed in telephone surveys of the broader Russian population.

The study's own findings, moreover, seem to support the hypothesis that Russians avoid surveys out of distrust of the government and wariness of sociologists: the highest refusal rates were among young Russians, particularly those aged 24 and younger – precisely the age group that tends to be the most oppositional and critical of the government.
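A back-of-the-envelope calculation illustrates the gap between the 30% rate in Levada's follow-up panel and the 5-7% seen in random samples. The split of the population into a "panel-like" minority and everyone else, and the 1% rate assumed for the latter, are illustrative assumptions, not figures from the study.

```python
# Assumed rates: Levada's panel of past respondents answers at 30%,
# everyone else at roughly 1%, and random telephone samples land near 6%.
RR_PANEL_LIKE = 0.30
RR_OTHERS = 0.01
RR_OVERALL = 0.06

# Solve RR_OVERALL = c * RR_PANEL_LIKE + (1 - c) * RR_OTHERS for c,
# the share of the population that behaves like Levada's panel.
c = (RR_OVERALL - RR_OTHERS) / (RR_PANEL_LIKE - RR_OTHERS)
print(f"Implied 'panel-like' share of the population: {c:.0%}")  # about 17%
```

Under these assumptions, the people Levada re-contacted represent only about a sixth of the population, which is why their willingness to answer says little about everyone else.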

Wartime surveys

For a long time, opinion polls attracted little interest in Russia. The absence of free elections, and limited interest in public opinion outside periods of mass protest, contributed to this apathy. After the start of the full-scale invasion of Ukraine, however, global media suddenly became interested in how “ordinary Russians” perceive the situation. It was at this moment that the Levada Center returned triumphantly, confirming popular support for the government's policy. Yet certain questions remained unanswered.

If the war truly enjoys widespread popularity, why do so few Russians adorn their vehicles with Z and V symbols? Why is there a scarcity of volunteers for the military? Why are there no grassroots demonstrations in support of the war, apart from those orchestrated by the authorities? Why are influential figures from the entertainment industry, once loyal and well-integrated – Alla Pugacheva, the most influential Russian pop artist of the past 50 years; her husband, the comedian Maxim Galkin; the powerful producer and “star-maker” Iosif Prigozhin; Valery Meladze, a highly popular singer of Ukrainian descent – speaking out against the war? Despite these questions, Western media outlets take Levada’s figures at face value.

The statistics favourable to the government may be one of the reasons why Levada Center has not been labelled an “undesirable organization” — a status that hinders any activity in Russia.

Independent sociologists have strong evidence that the polls are distorted. Elena Koneva, founder of the independent research agency ExtremeScan and a participant in the Chronicle project, explains:

“The initial outcomes of the street polls conducted in February, which indicated 58% support for the war, took us aback. It seemed utterly implausible. We began looking for explanations and hypothesized that socially acceptable responses were playing a role. When the 'fake news' law was adopted a week later, we became convinced it could distort the results of our surveys. So we decided to verify the results by adding two more response options - 'I have difficulty answering' and 'I don't want to answer that question.'
This seemingly small modification had a notable impact on the results. Throughout our study, an unexpectedly high proportion of respondents (31%) found it difficult or refused to give a direct answer to the question about their support for the war – striking, given how significant the issue is for the country. The peak of evasive responses (36%) came on September 29-30, 2022, immediately after the announcement of “partial” mobilization. Over the year, only 10-13% of respondents openly stated that they did not support the war. The data suggest, however, that the actual number of non-supporters is several times higher, as many people chose the 'I do not want to answer this question' option out of concern for their safety.
It is also crucial not to take the numbers at face value. People may attach different meanings to the notion of 'support.' My personal estimate is that real support for the war amounts to approximately 35%, while a significant portion, at least 30% of Russians, are genuine opponents of it. By genuine support I mean cases where people can offer arguments of their own or show a willingness to act – to fight, to donate money, and so on.”

The premise that Russians are hesitant to express their true opinions to sociologists has strong grounds. Grigory Yudin cites an incident involving the German media outlet Deutsche Welle, which conducted a survey in Moscow on supplying Leopard tanks to the Ukrainian military. Several respondents openly expressed their support, and once the footage was shared on YouTube, their statements led to criminal cases.

Contrary to the claims of Alexei Levinson, who heads socio-cultural research at the Levada Center, not everyone believes that respondents are free of fear. Yudin, along with Daria Pavlova of Russian Field, believes that the ongoing war significantly affects the sincerity of answers. Pavlova finds it hard to credit Levada's assertion that one in four households willingly opens its door for a conversation, and stresses how difficult it is to track refusals in street surveys and door-to-door interviews. Russian Field has tried technical solutions, but the human factor still prevents refusals from being counted accurately.

“I’ve got piglets to feed”

According to independent sociologists, the main problem is not the technical design of the polls but the premise on which polling itself rests: that respondents hold pre-existing opinions on major political questions. But what if they don't?

Yudin asserts that general indifference, rooted in learned helplessness, is even more pronounced than fear. The biggest issue, he says, is not whether people are pro-Putin or anti-Putin, but that they simply don't care. The response option of “I have difficulty answering” doesn't capture this. So when people who don't give a damn are put on the spot, they start inventing answers on the spur of the moment. They reason, Yudin suggests, that since they agreed to answer, they must have something to say about the “special military operation” – and if they believe Putin knows what he's doing, that must mean they support it.

A large – perhaps the largest – group of the population consists not of those who are for or against the war, but of those who refuse to think about it, Pavlova concurs. Russian Field’s analysis shows that the biggest group is made up of people who simply don't care enough to reflect on current events: they neither support nor oppose. One respondent actually said he would rather go feed his two cute little pink piglets than answer the poll’s questions.

The Russian authorities are skillfully free-riding on this apathy and indifference. Their priority is not so much to convince Russians that Putin is doing the right thing as to create the illusion that everyone around them believes it, and thereby discourage any protest. Independent sociologists see this as the main danger of the Levada polls: the widely discussed, mythical 80% support figure helps the Kremlin cultivate the loyalty of the population.

Polling in a concentration camp

Does this suggest that asking political questions is off-limits in present-day Russia? Not exactly. The implication is that the focus of these surveys should shift from revealing the “opinions” of particular demographic segments to uncovering intriguing patterns or phenomena, according to Grigory Yudin.

Yudin finds Russian Field’s approach intriguing:

“They've implemented a scheme where they ask a pair of questions, both starting with 'If tomorrow Vladimir Putin decides that...', followed by two contradictory decisions, such as continuing the 'special military operation' or ending it. Surprisingly, a vast majority of respondents express support for both decisions, even though the choices are mutually exclusive. Why so?”

The sociologist explains that it's because the actual question people hear is whether they support Vladimir Putin, and naturally they tend to say yes.

Those kinds of polls don't say anything about the “will of the people.” But this doesn't mean that polls are entirely without value. They can be of domain-specific interest, or they can identify differences in the perceptions of different groups, based on their age or well-being. However, these very details are never disclosed by the pollsters. If they were, they would immediately undermine the picture of a unified and homogeneous society.”

Apart from surveys, there are also qualitative sociological studies. While these may not yield an exact count of supporters of particular ideas, they can shed light on the reasoning, logic, and arguments of different groups – both those who support the war and those who oppose it, emphasizes sociologist Elena Koneva. She insists that there is no widespread, strong support for the war:

“The support that does exist is sometimes explained by nostalgia for the Soviet Union and the “revival” of imperial thinking. Research, however, does not bear this out. As for the opponents of the war – many of whom took no interest in politics before it began – they present an anxious, depressed, and seemingly passive profile. These are people who were invested in peaceful lives, working in skilled professions and raising families. Despite the isolation and pessimism they feel now, they understand the gravity of the consequences of the war their country has unleashed. Once they find their footing in this position, it will release the potential for a transition from silent non-support to open and acceptable forms of anti-war activism.”

*VTsIOM also publishes its response rate, and it is extremely low for telephone surveys

Illustrations throughout the article generated by MidJourney.

