The survey was created for multiple purposes, one of which is to inform this submission to the Senate inquiry into the preparation, administration and management of the 2016 Census, due to report to Parliament at the end of November.
The #CensusFail survey provides a way for people to have input into the inquiry without creating their own submissions. It therefore speaks for a large number of people, many of whose voices might otherwise not be documented in any permanent manner, or perhaps at all. Care will accordingly be taken in interpreting the results to ensure those voices are adequately and appropriately represented.
While the census itself attempts total enumeration (an outcome requiring at least 95% participation, which is in considerable doubt at the time of writing), the #CensusFail survey provides detailed data on the sub-population concerned with the conduct of the 2016 census and those interested in the dissent surrounding it.
Specifically, the following terms of reference are particularly relevant to survey questions.
In discussing the impacts of the many failures surrounding the 2016 census on return rates and data quality, it is worth noting that the ABS has provided a moving target in reporting census form response rates. According to the ABS, the 2011 census was sent to 9.8 million households, yet only 9.1 million households were subsequently recorded in the population figures: a substantial difference of 700,000 households.
The ABS 2016 Census 'cheatsheet' claims: 'The 2016 Census will count close to 10 million households and approximately 24 million people in Australia on Census night.' The figure of 10 million households has been used in several media stories and is the only total figure I could find associated with expectations for the 2016 census.
According to former Head of the ABS, Bill McLennan, a 95% completion rate is necessary for census data to be usable. Based on the ABS figure for the number of households in Australia, 9.5 million forms must therefore be returned with usable data. Contradicting this logic, on September 2 Census Head Duncan Young claimed that the 7 million forms received at that time equated to an 80% completion rate. Yet 7 million is 70% of 10 million, which indicates that the total-households figure the ABS is using to calculate completion rates has been revised down, obscuring the true non-completion rate.
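The arithmetic behind this observation can be checked directly. A minimal sketch, using only the figures quoted above; the implied-total calculation (what household count would make a 7 million return equal 80%) is my own back-of-envelope step:

```python
# Figures quoted in the text above.
forms_received = 7_000_000        # forms received by September 2, per the ABS
expected_households = 10_000_000  # ABS 'cheatsheet' expectation
claimed_rate = 0.80               # completion rate claimed by Duncan Young

# Completion rate measured against the published 10 million household figure.
rate_vs_expected = forms_received / expected_households
print(f"Rate against 10M households: {rate_vs_expected:.0%}")  # 70%

# The household total implied by the claimed 80% completion rate.
implied_total = forms_received / claimed_rate
print(f"Implied household total: {implied_total:,.0f}")  # 8,750,000
```

On these figures, an 80% claim implies a denominator of 8.75 million households, some 1.25 million below the ABS's own published expectation.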
Given what might be interpreted as an effort by the ABS to shift its targets and conclusions as response rates change, an independent survey of how people actually behaved in responding to the 2016 census is an important step in filling in a missing piece of the puzzle.
The #CensusFail survey is not intended as a random sample to be generalised to the entire population, but as an opportunity to gather information on the behaviours, concerns and beliefs of those particularly concerned with the 2016 census. For this reason, and for reasons of practicality, purposive/snowball sampling was used to recruit respondents, mainly through Twitter, which sustained a discussion of the issues throughout the weeks leading up to and following census night.
It is notable that some questions (Q3, 6, 8, 9, 10), concerning the de-anonymisation of the census, showed near-universal consensus among respondents. Other questions relate to specific responses to the concerns people held about the 2016 census, and provide data to help policy makers and researchers determine the quality issues likely to afflict the 2016 census data.
The survey was carried out using Survey Monkey for reasons of time and convenience. 546 responses were received by September 13, when the survey closed and analysis for this submission began. While the sample size is numerically large enough to generalise to 10 million households, the sampling technique means that for some questions (Q1, 2, 4 & 5) it would not be appropriate to generalise response patterns to the total population. For these questions, it is the ways in which respondents vary in behaviour and belief from the wider population that the survey intends to document.
Having said that, there are questions on which the sample is unlikely to vary substantially in relevant characteristics from the general population, and these questions (Q3, 6, 8, 9, 10) could reasonably be claimed to represent not only the sample but the wider population.
Throughout the 2016 census period, the ABS remained steadfastly uncommunicative regarding the problems experienced by a vast number of census respondents. The introduction of widespread use of the online form, and the manner in which this change was implemented, created frustration and confusion on a scale suggestive of gross maladministration.
The #CensusFail survey is in some regards a response to what has been seen by many as a campaign of misinformation by the Australian Bureau of Statistics. Rather than relying on an organisation suffering a gross loss of trust to report how people are responding to the census, an independent survey provides an alternative source of information and gives a voice to the many people whose concerns have gone unacknowledged by the ABS.
The survey (and this submission in its entirety) was designed by Rosie Williams, who holds a degree in social research/public policy and who has been a key contributor to discourse on the privacy issues introduced in the 2016 census. Click next or use the menu on the left to navigate through the survey responses.