Survey of Electors on Communications with Electors
Introduction
Phoenix Strategic Perspectives Inc. (Phoenix) was commissioned by Elections Canada to conduct a survey of electors on practices used to communicate with Canadian electors.
Background and Objectives
Elections Canada is an independent, non-partisan agency that reports directly to Parliament. The Agency is mandated to conduct federal general elections, by-elections and referendums, administer the political financing provisions of the Canada Elections Act (CEA), monitor compliance, and enforce electoral legislation.
During and after the 41st federal general election, Elections Canada received numerous complaints about automated telephone calls and live calls. The complaints alleged either that callers falsely claiming to be from Elections Canada reported changes to polling places when no such changes had occurred, or that electors felt harassed by calls falsely purporting to come from a particular candidate or party, whether because of the timing, frequency, or tone of the calls. This issue has garnered considerable media attention.
At an appearance before a parliamentary committee last March, the Chief Electoral Officer committed to table a report on the allegations of wrongdoing before the end of March 2013.
The Agency decided to take this opportunity to address a wide range of issues dealing with communications with electors in the context of federal elections. The objective of the survey was to assess electors' opinions and attitudes on various issues related to communications with electors. More specifically, surveyed electors were consulted on the following issues:
- The practices of political parties and candidates in communicating with electors, including their preferences on being contacted by political parties and candidates.
- The protection of personal information.
- The sources of information used by electors to obtain knowledge about political parties and candidates as well as the electoral process during an election.
- The level of trust in the institutions and entities involved in the electoral process.
- The use of technologies.
The results obtained from this survey will increase the Agency's knowledge about the opinions and attitudes of Canadians on communication practices with electors. Additionally, the findings will be used to assist in the production of the Chief Electoral Officer's report on the complaints about automated telephone calls and live calls during and after the 41st federal general election.
Research Design
A random digit dial (RDD) telephone survey was conducted with 1,011 eligible electors from among the general population. Eligible electors are Canadian citizens, at least 18 years of age at the time of the survey. Based on a sample of this size, the overall results are accurate to within ±3.4%, 19 times out of 20 (adjusted for sample stratification). The margin of error is greater for results pertaining to subgroups of the total sample.
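As a point of reference, the margin of error for simple random sampling at this sample size can be sketched as follows. This is an illustrative calculation only: the ±3.4% quoted above additionally incorporates a design-effect adjustment for the regional stratification, which is not reproduced here.

```python
import math

# Unadjusted margin of error at the 95% confidence level
# ("19 times out of 20") for n = 1,011, assuming the worst
# case p = 0.5. The stratification adjustment that yields the
# reported +/-3.4% is not modelled in this sketch.
n = 1_011
z = 1.96                              # z-score for 95% confidence
moe = z * math.sqrt(0.5 * 0.5 / n)   # sqrt(p(1-p)/n)
print(f"Unadjusted margin of error: +/-{moe:.1%}")   # ≈ ±3.1%
```

The gap between this unadjusted ±3.1% and the reported ±3.4% reflects the design effect introduced by the disproportionate regional allocation described below.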
The following specifications applied to the survey:
- The sample frame included both landline and cellphone households. The sample was created in order to ensure representativeness at the regional level. The geographic distribution of the interviews was disproportionate in order to improve the accuracy of regional results. The regional stratification was as follows:
| Area | Target Number of Interviews | Actual Number of Interviews |
| --- | --- | --- |
| Atlantic Provinces | 150 | 150 |
| Quebec | 200 | 203 |
| Ontario | 250 | 252 |
| Prairies | 250 | 253 |
| British Columbia | 150 | 153 |
| Canada | 1,000 | 1,011 |
- A pre-test was conducted in advance: 16 interviews in English, 12 in French.
- All interviewing was conducted in the respondent's official language of choice.
- Interviews averaged 12.4 minutes in length.
- The fieldwork was conducted November 21st through December 2nd, 2012.
The following table presents the final call dispositions for this survey, as well as the calculation of the response rate (using MRIA's empirical formula):
| Calls | Total Sample | Landline Sample | Cell Sample |
| --- | --- | --- | --- |
| Total Numbers Attempted | 25,036 | 13,823 | 11,213 |
| Out-of-scope - Invalid | 8,075 | 3,568 | 4,507 |
| Unresolved (U) | 11,485 | 6,003 | 5,482 |
|   No answer/Answering machine | 11,485 | 6,003 | 5,482 |
| In-scope - Non-responding (IS) | 642 | 550 | 92 |
|   Language barrier | 170 | 129 | 41 |
|   Incapable of completing (ill/deceased) | 108 | 97 | 11 |
|   Callback (Respondent not available) | 364 | 324 | 40 |
| Refusal | 3,663 | 2,771 | 892 |
| Termination | 65 | 58 | 7 |
| In-scope - Responding units (R) | 1,106 | 873 | 233 |
|   Completed Interview | 1,011 | 810 | 201 |
|   NQ - Quota Full - Gender | 59 | 50 | 9 |
|   NQ - Age | 23 | 0 | 23 |
|   NQ - Not a Citizen | 13 | 13 | 0 |
| Refusal Rate (%) | 77.12 | 76.42 | 79.42 |
| Response Rate (%) | 6.52 | 8.51 | 3.47 |
The MRIA response rate formula is as follows: Response Rate = R / (U + IS + R). That is, the response rate is calculated as the number of responding units (R) divided by the sum of unresolved numbers (U), in-scope non-responding households and individuals (IS), and responding units (R).
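The published rates can be reproduced from the total-sample dispositions above. In the sketch below, grouping refusals and terminations into the in-scope non-responding count, and the refusal-rate formula itself, are assumptions inferred from the published figures rather than taken from MRIA documentation.

```python
# Worked check of the response rate using the "Total Sample" column
# of the call-disposition table above.
U = 11_485                               # unresolved numbers
IS = (170 + 108 + 364) + 3_663 + 65      # non-responding, plus refusals and
                                         # terminations (assumed part of IS)
R = 1_011 + 59 + 23 + 13                 # completes + quota-full/not-qualified

response_rate = R / (U + IS + R)
print(f"Response rate: {response_rate:.2%}")   # → 6.52%

# The published refusal rate is likewise consistent with
# (refusals + terminations) / (refusals + terminations + R):
refusal_rate = (3_663 + 65) / (3_663 + 65 + R)
print(f"Refusal rate: {refusal_rate:.2%}")     # → 77.12%
```

Both figures match the total-sample column of the table, which suggests this grouping of dispositions is the one used in the published calculation.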
Note to Readers
- For editorial purposes, the terms 'electors' and 'respondents' are used interchangeably to denote survey participants.
- All results in the report are expressed as percentages, unless otherwise noted.
- In some specific cases where the sample size is noticeably small, the total unweighted number is presented instead of percentages.
- Throughout the report, percentages may not always add to 100% due to rounding.
- The number of respondents changes throughout the report because questions were often asked of sub-samples of the survey population. Accordingly, readers should be aware of this and exercise caution when interpreting results based on smaller numbers of respondents.
- At times, the number of respondents who answered certain questions or answered in a certain way is provided. The following method is used to denote this: 'n=100', which means the number of respondents, in this instance, is 100.
- Socio-demographic differences are identified in the report. The text describing these differences throughout the report is set in a shaded box for easy identification. When reporting subgroup variations, only differences that are significant at the 95% confidence level, indicative of a pattern, and/or pertaining to a subgroup sample size of more than n=30 are discussed in the report.
- Where relevant, the results are compared to those of the 41st federal general election, held on May 2, 2011, as a reference point.