Those relished and cursed public opinion polls
If a public opinion poll about polls were to be taken, odds are that a majority would offer some rather unfavourable views of pollsters and the uses to which their work is put. In contrast, if you asked whether politicians, business leaders, and journalists should pay attention to the people’s voices, almost everyone would say yes.
If you have not really thought in this direction, let me break it down for you: politics is the biggest marketing gimmick of them all. A business may or may not rely on data for its next move, but politics can rarely afford a blind decision. Over the last few years, newspapers, magazines, radio, and television have all been discussing, and wrangling over, the results of surveys.
From tracking election fever to understanding the voter base, and from launching a campaign to getting the manifesto right, government agencies, political consultants, political parties, and political action committees all need political surveys to read and respond to the public mood. Media outlets, too, enjoy their share of quick polls on hot-button issues. And there are many more sides to political surveys than these.
Survey research is a quantitative and qualitative method with two defining characteristics: the variables of interest are measured through respondents’ self-reports rather than through the researchers’ own observations, and sampling plays a central role. Surveys can be used to describe single variables (e.g., the percentage of Maltese who support abortion) or to explore statistical relationships between variables (e.g., the relationship between income and partisanship).
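To make that distinction concrete, here is a minimal sketch in Python, using entirely invented responses, of the two uses just mentioned: summarising a single variable as a percentage and cross-tabulating two variables to look for a relationship. The income brackets and party labels are placeholders, not real data.

```python
from collections import Counter

# Hypothetical, made-up survey responses: each record is one respondent's
# self-reported income bracket and party preference.
responses = [
    {"income": "low", "party": "A"}, {"income": "low", "party": "A"},
    {"income": "low", "party": "B"}, {"income": "high", "party": "B"},
    {"income": "high", "party": "B"}, {"income": "high", "party": "A"},
]

# Describing a single variable: the share of respondents preferring party A.
share_a = sum(r["party"] == "A" for r in responses) / len(responses)
print(f"Support for party A: {share_a:.0%}")

# Exploring a relationship: party preference broken down by income bracket.
crosstab = Counter((r["income"], r["party"]) for r in responses)
for (income, party), count in sorted(crosstab.items()):
    print(f"income={income:4s} party={party}: {count}")
```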
Used this way, surveys offer a useful means of navigating individuals’ opinions and preferences on a range of issues. But surveying is more than simply asking people what they think.
Whether the aim is to explore Maltese public opinion on same-sex marriage or the public’s views on immigration policy, surveys give us a sound basis for inferring what our population of interest has in mind. Survey research consists, broadly, of four stages: developing the survey, sampling, fielding the survey, and analysing the results.
Even a well-designed survey can go wrong, sometimes for reasons outside the researchers’ control. Beyond errors introduced at the sampling stage, there are further potential errors and drawbacks simply because surveys depend on human responses.
Participants may tire of answering the questions, click “don’t know,” give random responses, or simply skip questions altogether, and the quality of their answers deteriorates as a result. Even small differences in question wording can alter the results.
The most common ways in which public opinion polls are conducted are online, by telephone, and through face-to-face interviews. Polls tell us what proportion of a population has a specific viewpoint. They do not explain why respondents believe as they do or how to change their minds. That is the work of social scientists and scholars. Polls are simply a measurement tool that tells us how a population thinks and feels about any given topic at any given moment in time.
This can be useful because it gives people a chance to speak for themselves instead of letting only vocal media stars speak on behalf of everyone. Opinion polling gives people who do not usually have access to the media an opportunity to be heard.
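As a rough illustration of what that measurement involves, the sketch below computes a poll’s headline proportion together with the conventional 95 per cent margin of error for a simple random sample. The sample size and the 52 per cent result are invented for the example, and the formula assumes simple random sampling rather than the weighting that real pollsters apply.

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion p estimated
    from a simple random sample of size n (normal approximation)."""
    return z * math.sqrt(p * (1 - p) / n)

# Invented example: 52% of 1,000 respondents favour a proposal.
p_hat, n = 0.52, 1000
moe = margin_of_error(p_hat, n)
print(f"Estimate: {p_hat:.0%} ± {moe:.1%}")  # roughly 52% ± 3.1%
```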
Return to that hypothetical poll about polls. If you went on to ask whether polls are, at least, one tool through which the wishes of the people can be discerned, a reluctant majority would probably say yes to that too.
Several conundrums of public opinion polling are enfolded in this hypothetical tale. People of all kinds, activists and ordinary citizens alike, regularly cite polls, especially those that find them in the majority. But people are deeply sceptical of polls, especially when opinion moves in the ‘wrong’ direction.
Some of their doubts are about pollsters’ methods. Do they ask the right questions? Are they manipulating the wording of questions to get the responses they want? And whom did they interview? Some of the doubts are wrapped up in a mistrust of the political parties, marketers, and media giants that pay for the polls.
Sometimes respondents offer opinions on subjects they have not thought much about and do not care about at all. People answer pollsters’ questions just to be polite, because they figure they probably ought to have an opinion. That gives pollsters a lot of running room to ‘manufacture’ opinion, especially on issues of narrow rather than wide concern.
When interest groups commission pollsters to ask leading questions to gather ‘scientific’ proof that the public agrees with whatever demand they are making on government, they demean polling and mislead the public. When analysts, sometimes innocently, use poll numbers as a definitive guide to public opinion, even on issues to which most people have given little thought, they are writing fiction rather than citing fact.
Relationships between citizens and leaders, between public opinion and democratic governance, are complex. Contemporary politicians too often put their fingers to the wind of public opinion when deciding what policies to advance. Yet the fragility and ambiguity of public opinion make the use of polls problematic as a direct, dominant guide to formulating public policy.
Simply put, the birth and growth of polling and surveys have not been met with uniform acceptance or approval. When high-profile political outcomes do not match what the polls showed, speculation grows that the people taking the surveys deceived the pollsters. If you doubt that, just take a poll.