2011 Census Form Design Report

Liability
While all care and diligence has been used in processing, analysing, and extracting the data
and information in this publication, Statistics New Zealand gives no warranty that it is error
free and will not be liable for any loss or damage arising directly or indirectly from the use of
the information in this publication.
Reproduction of material
Material in this report may be reproduced and published, provided it does not claim to be
published under government authority and acknowledgement is made of this source.
Citation
Statistics New Zealand (2010). 2011 Census form design report.
Wellington: Statistics New Zealand
Published in December 2010 by
Statistics New Zealand
Tatauranga Aotearoa
Wellington, New Zealand
Contact
Statistics New Zealand Information Centre: [email protected]
Phone toll-free 0508 525 525
Phone international +64 4 931 4610
www.stats.govt.nz
ISBN 978-0-478-35387-7 (online)
Preface
The 2011 Census Form Design Report describes the improvements that have been
made to the 2011 Census forms.
The 2011 Census forms are very similar to those used in the 2006 Census. As no
questions have been added or deleted, the overall number and order of questions
remain the same.
Statistics New Zealand’s strategy for the 2011 Census is minimal content change and,
in line with that strategy, no new topics or questions have been added to the 2011
Census forms. Instead, we have focused our resources on promoting the online
census option, and on making only minor changes to the census forms and guide
notes – changes that emphasise data quality while maintaining consistency with past
censuses.
This report is the final in the series of reports on the 2011 Census content development
process, which includes the:
• 2011 Census Content Report (October 2009)
• 2011 Census Content: Submissions Report (November 2008)
• Final Report of a Review of the Official Ethnicity Statistical Standard 2009
(October 2009).
Geoff Bascand
Government Statistician
Contents
Introduction
Testing overview
  Cognitive testing
  Mass form completion
  Online form usability tests
  Pilot test – March 2009
  Census dress rehearsal – March 2010
General changes to the 2011 Census forms
  Government Statistician’s message
  Office use boxes
  Environmental sustainability messaging
  Other changes to forms
  Changes to typesetting
  Census dateline
  Census website promotion
  Māori/English form front page
Changes to questions in the 2011 Census forms
  Individual form changes
  Dwelling form changes
2011 Census Māori/English form development
  Background
  2011 form development
2011 Census guide note development
  Background
  Print guide notes – English
  Print guide notes – Māori/English
  Online guide notes – English and Māori/English
  2011 Census guide note changes
  Communications changes
  Brief descriptions of guide note changes for 2011 Census
2011 Census online form development
  Background
  Features of the 2011 online census forms
  Māori/English online form features
Introduction
The 2011 Census of Population and Dwellings will be held on Tuesday, 8 March 2011.
Statistics New Zealand aims to collect census information from everyone who is in
New Zealand on census day. Census information is collected through the census
forms, which can be completed either online or on paper and in either English or
Māori.
Information from the census is used to help determine how billions of dollars of
government funding is spent in the community. Data quality and accuracy are therefore
important, as is maintaining consistency across censuses.
This report describes the improvements that have been made to the 2011 Census
forms within the context of the minimal content change strategy adopted for 2011. For
each change made, a rationale is given, along with pre-testing findings and
recommendations for future development. It also summarises the testing methodology,
and discusses development of the guide notes, Māori content, and the online form.
This report completes the series of reports detailing the 2011 Census content
development process. For more information on this process, see: 2011 Census
content development, available from www.stats.govt.nz.
Testing overview
A robust programme of pre-testing is central to any development project for census
forms. Pre-testing seeks to confirm the forms are collecting data of acceptable quality.
It also allows the form design team and stakeholders, such as census data users, to
evaluate the merits of proposed changes to form design.
With the design project for the 2011 Census forms limited to minimal change, the
testing programme was significantly reduced compared with the programme for the
2006 Census forms. Comprehensive cognitive testing was still carried out, though the
total number of tests (152) was less than for the 2006 Census (250). The 2006 Census
testing programme included five field tests; the 2011 Census testing programme
consisted of only two.
Cognitive testing
Cognitive testing is a method used to test how well a question meets the combined
needs of respondents and data users. A good question should be easy for
respondents to understand and answer accurately, and should collect data of
sufficient quality for data users.
Typical cognitive testing involves one-to-one interviews in which respondents
complete census forms while being observed by interviewers. Respondents are
encouraged to verbalise the thought process they are using to answer the questions –
a technique referred to as ‘think aloud’. The think-aloud technique is used alongside a
testing protocol, which specifies that interviewers use concurrent and retrospective
questioning to further check respondents’ understanding of the questions and the
cognitive processes they use to select their answers.
Cognitive tests of the 2011 Census forms were conducted over four phases between
October 2008 and May 2009. Over 150 respondents were tested at locations
throughout the North and South Islands, spread across Statistics NZ offices,
respondents’ homes, and third-party venues.
There were 116 cognitive tests of the English language forms, conducted over four
rounds: two in preparation for the pilot test and two in the lead-up to the dress
rehearsal. Each round of testing was split into phases, and the significant findings
from one phase of tests were incorporated into the next.
English form testing

Testing round   Dates                          Tests   Regions
One             October 2008                   24      Wellington, Canterbury
Two             December 2008–February 2009    29      Wellington, Bay of Plenty
Three           March 2009                     25      Auckland
Four            April–May 2009                 38      Canterbury, Otago, Southland, Wellington
There were also 36 cognitive tests of the Māori forms, spread over three rounds. See
the ‘2011 Census Māori/English form development’ section of this report for more
detail about these tests.
Māori form testing

Testing round   Dates              Tests   Regions
One             January 2009       8       Gisborne, Wellington
Two             March–April 2009   18      Gisborne, Wellington
Three           June 2009          10      Wellington
Early testing rounds used forms very similar to the 2006 Census forms, with the aim of
observing and understanding the problems suggested by evaluation of the 2006
Census data. By observing respondents completing the questions known to have
issues, the form design team was better able to develop possible solutions to those
issues. From these early tests, the forms underwent an iterative process of redesign,
testing, and evaluation until the team made a final decision on which versions of the
questions would be included in the pilot test and dress rehearsal forms.
Mass form completion
To make best use of visits to groups with many volunteers but little time, mass form
completions were conducted with a range of groups in locations throughout New
Zealand. At some venues where cognitive testing was being undertaken, forms were
also made available to volunteers who could not be cognitively tested because of time
constraints.
The mass completion exercises involved handing out abbreviated census forms for
people to complete while one or more members of the form design team waited to
collect the completed forms. At some venues, a table was set up for people to ‘drop in’
and complete a form. Mass completion data was keyed into spreadsheets for analysis
of measures such as non-response rates for questions known to have non-response
issues.
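The keyed mass-completion data lends itself to a simple tabulation of item non-response. The sketch below is illustrative only – the record structure and field names are invented for the example and are not taken from Statistics NZ systems:

```python
# Illustrative sketch: compute item non-response rates from keyed form data.
# Field names are hypothetical, not from the actual census processing system.
from collections import Counter

def non_response_rates(records, questions):
    """Return the share of blank answers for each listed question."""
    blanks = Counter()
    for record in records:
        for q in questions:
            # a missing or empty value is treated as item non-response
            if not record.get(q, "").strip():
                blanks[q] += 1
    total = len(records)
    return {q: blanks[q] / total for q in questions}

keyed = [
    {"maori_descent": "yes", "birthplace": "India"},
    {"maori_descent": "", "birthplace": "New Zealand"},
    {"maori_descent": "no", "birthplace": ""},
    {"maori_descent": "", "birthplace": "China"},
]
rates = non_response_rates(keyed, ["maori_descent", "birthplace"])
# rates -> {"maori_descent": 0.5, "birthplace": 0.25}
```

A tabulation like this only flags questions worth a closer look; as the report notes, it says nothing about why respondents left an item blank.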
This technique is useful only for getting a feel for how questions are working by
looking at a completed form; there is no opportunity to observe respondents as they
answer, or to ask them about individual questions or problems they might have. The
technique was used mainly to test new concepts and look for some indication of item
non-response. Anything that seemed to work satisfactorily in mass completion was
tested further, either in a future round of cognitive testing or in one of the field tests.
With the field testing programme for the 2011 Census limited to two field tests, mass
completion helped to give some idea of whether a strategy was worth pursuing, but
the findings from mass completion are not discussed in this report.
Online form usability tests
A small number of usability tests of the online forms were held, with the objective of
testing a streamlined logon process and the usability of the redesigned Internet
Access Code. Given that the overall design of the online forms was virtually
unchanged and the questions had been extensively tested during development of the
paper forms, form design and question development were secondary objectives of the
usability testing.
Usability tests follow a similar structure to a cognitive interview. Respondents are
observed as they complete the forms and are encouraged to think aloud. Concurrent
and retrospective probing is also used to gain more insight into the response process.
Pilot test – March 2009
A pilot test was held on 24 March 2009 with a sample of approximately 1,200
dwellings selected from the Canterbury region. The pilot test offered respondents the
choice of completing forms online or on paper, but no Māori/English forms were
tested.
The primary objectives of the pilot test were to test the delivery and collection
strategies and aspects of the form design – it was not a full test of all census
processes. The ability to draw any firm conclusions on the form design was limited,
given the very small and non-representative sample. For this reason, the findings from
the pilot test were mainly used to inform future rounds of cognitive testing, rather than
to confirm that question changes were working. However, pilot test evaluation was
able to identify form changes that were not worth pursuing.
Census dress rehearsal – March 2010
A dress rehearsal was held on 9 March 2010 with a sample of approximately 8,000
dwellings selected from the Auckland, Gisborne, Wellington, and Canterbury regions.
The dress rehearsal objectives were comprehensive: testing not only the forms (both
paper and online) but also the field procedures, and the scanning, processing,
evaluations, and outputs processes.
Unlike the pilot test, the dress rehearsal did test the distribution and uptake of the
Māori/English forms. In line with the overall Māori/English form strategy, residents in
the Gisborne dress rehearsal areas received Māori/English forms by default.
The dress rehearsal included a follow-up survey with a sample of dress rehearsal
respondents to gain their perspectives on a variety of issues, including questions
about the forms, online form usability, expectations about content, and what help they
sought and where they found that help.
Like the pilot test, the dress rehearsal was sampled in such a way as to test particular
aspects of the census process. This method of sampling, known as purposive
sampling, means the sample is in no way representative of the population as a whole.
While firm conclusions about the effectiveness of form changes must wait until after
the 2011 Census itself, the dress rehearsal data evaluation process was able to make
recommendations on the changes in all but a few cases.
General changes to the 2011 Census forms
Government Statistician’s message
An obvious change from 2006 was the change of the Government Statistician, from
Brian Pink to Geoff Bascand. The form design team consulted with the Government
Statistician, the Census General Manager, and other internal stakeholders about the
2011 Census message.
This consultation resulted in no significant changes to the Government Statistician’s
message; however, subtle changes were made to the message regarding the:
• uses of census data
• retention of census forms
• use of census information for post-censal survey administration
• thank you statement.
For comparison, the two Government Statistician messages are shown below.
2006 Government Statistician’s message

Filling in census forms is required by law. Census information is needed for planning
in important areas such as education, health, housing, business planning and
investment. It is also used to help us understand how our society changes over time.

The information you provide must be kept confidential by Statistics New Zealand and
is protected by the Statistics Act 1975. Census information can only be used for
statistical purposes.

The Public Records Act 2005 requires census forms be kept as historical records.
After 100 years census forms may be made available for research that meets the
confidentiality requirements of the Statistics Act.

Statistics New Zealand will also use census responses to select a small sample of
people to participate in a survey on disability.

Thank you for taking part in this important census.

2011 Government Statistician’s message

Filling in census forms is required by law. Census information is needed for planning
vital public services such as education, health, housing, and transport. It is also used
to help understand how our society changes over time.

The information you provide must be kept confidential by Statistics New Zealand and
is protected by the Statistics Act 1975. Census information can only be used for
statistical purposes.

The Public Records Act 2005 requires census forms be retained. After 100 years
census forms may be made available for research that meets the confidentiality
requirements of the Statistics Act.

We will also use census responses to select people for two surveys after the census.

Thank you for your time and effort.
Note that the individual form messages are shown above. The dwelling form message is
identical except for the statement on post-censal surveys.
Office use boxes
Some business processes in the 2006 Census required the office use boxes to carry
both the Internet ID and three office use fields to be completed by field staff. A change
in business process for the 2011 Census means the additional field staff fields are no
longer required, so the office use boxes have been streamlined. This improves the
overall look of the forms: the larger 2006 office use boxes left less space to
accommodate the Government Statistician’s message and the other communications
carried at the top of the forms.
On the individual form, this means that the office use box is now only the Internet ID
with the person number field (‘PER’) appearing in a lighter coloured field below the
Internet ID.
2006 Individual form office use box
2011 Individual form office use box
On the dwelling form, some field staff fields are still needed, but these have been
moved to a separate field staff use box, meaning the Internet ID is streamlined in
much the same way as the individual form.
2006 Dwelling form office use boxes
2011 Dwelling form office use box
A key benefit of this change is that the Internet ID is now contained in a single box,
without any other information confusing the respondent. Respondents are asked to
use their Internet ID when logging on to the online forms and when using the helpline,
so it is important that the Internet ID is easy for respondents to see.
Environmental sustainability messaging
Statistics NZ is conscious of the environmental impact of a data collection operation
that uses such large volumes of paper. For this reason, Statistics NZ requested that
the paper stock for census forms be sourced from environmentally sustainable
sources. To meet this aim Statistics NZ's print vendor, Wickliffe Limited, has sourced
paper that meets the requirements of the PEFC (Programme for the Endorsement of
Forest Certification) scheme.
PEFC certification requires that at least 70 percent of wood comes from PEFC-certified
forests that meet or exceed PEFC’s Sustainability Benchmark, and that the remaining
wood comes from controlled sources. For more information on PEFC certification,
see: www.pefc.org.
All 2011 Census forms carry an approved PEFC logo to let the public know that the
paper is responsibly sourced. The PEFC logo is shown below at full size. On the
forms themselves, the logo has been scaled to the smallest allowable size, 27mm
wide.
[Images: the PEFC logo at full size, and sized as it appears on the 2011 Census forms]
Other changes to forms
General aesthetic elements for the forms were out of scope for the 2011 Census form
design project. For that reason, there were no changes to form sizes, colours, or
logos. Someone comparing the 2006 and 2011 forms for ‘look and feel’ would be left
with the impression that very little has changed. Some of the more subtle changes to
the forms are noted below.
Changes to typesetting
To accommodate questions that became longer because of form changes, the top
margins of the forms have been set at 5mm in 2011, compared with 10mm in 2006.
Both the print vendor and the processing team were consulted and endorsed this
change.
Also, to make processing of religious affiliation data more efficient, the religions that
branch from the Christian response options now appear on a plain field with a blue
border, replacing the 2006 approach where the religious affiliations appeared on a
shaded background. The shading was causing a number of forms with no response to
these fields to be misrecognised as containing responses; the new design is intended
to prevent this.
Census dateline
The census dateline that appears on the forms has traditionally stated just the date,
but for 2011, the day, Tuesday, will also appear in the dateline, ie Tuesday, 8 March
2011.
Census website promotion
Key to the 2011 Census strategy is the decision to promote the online forms as the
preferred response option for users who can use the Internet. For example, in 2006, a
promotion of the helpline appeared before the census website, but in 2011, the
website promotion will appear before the helpline promotion.
Māori/English form front page
The front page of the Māori/English forms for the 2006 Census was either all brown
or all blue, depending on whether it was a dwelling or an individual form. As the Māori
language appears on a purple background within the form, the decision was made to
print the Māori instruction text for the 2011 Census forms on a purple background, for
consistency with the approach within the form.
Changes to questions in the 2011 Census forms
The following section shows and discusses the changes that have been made to
questions in the 2011 Census individual and dwelling forms. For each change, images
of the 2006 and 2011 questions show the differences. Discussion follows the images
and gives:
• a text description of the change
• the rationale for the change
• evidence gathered in pretesting to support the change being made
• recommendations for future development.
Each change described shows only the English language change, but the equivalent
changes have also been made to the Māori language forms.
Similarly, where necessary, all changes described have also been made in the online
forms. In the case of the name and Māori descent questions, changes have not been
necessary in the online forms due to the different ways the online form deals with text
responses and routing instructions.
The discussions are not intended to be exhaustive accounts of all the form
redevelopment efforts, and generally focus only on the changes as finally implemented
for the 2011 Census. Many questions went through a number of iterations of
development and testing before the approach for the census was decided. More
information can be made available on request.
Individual form changes
Question 2 – Name
2006 Census question
2011 Census question
Description of change
The first and last name fields are now offered as separate constrained-text response
fields with space for 26 characters each. These replace the free-text response spaces
offered in 2006.
Rationale for change
The name question comes directly after the ‘How to answer’ instructions where a
respondent is shown an example of writing a response in constrained-text response
fields. This change now makes the task match the earlier instruction of how to answer.
The names entered in the individual forms are matched to the names in the dwelling
forms (question 6) to ensure that all the forms for a dwelling have been collected. In
free-text spaces, the quality of printing is not as high as it is in constrained spaces.
Improving the quality of printing in the name field will make these processing checks
more efficient.
Evidence supporting change
A review of other New Zealand official forms – for example Inland Revenue tax
returns, New Zealand Passenger Arrival Cards – showed that constrained-text
response fields for names are in common use. Further, a review of practices in
overseas census forms showed that these typically use constrained-text response
fields for name data. Countries that follow this practice include Australia, England, the
United States, and Canada. From these reviews it was seen that constrained-text
response fields for names were both common practice in other official New Zealand
forms, and consistent with census form design practice in comparable countries.
Note that due to space constraints it was not possible to use constrained-text
response fields for the names required in the dwelling form in question 6.
Question 9 – Birthplace
2006 Census question
2011 Census question
Description of change
India has replaced Scotland as a country-of-birth response option, and China has
moved up the response option list, with India placed below China.
Rationale for change
Following each census, the response options for the birthplace question are revised
based on counts from the most recent census. With the exception of the Cook Islands
– which is included for reasons of data quality – the countries offered as choices are
the seven most common responses from the previous census.
The rationale for this approach includes trying to:
• reduce overall respondent burden by providing the most common countries of
birth in a list of choices to mark, rather than having respondents write their
response in
• reduce processing time and costs by reducing written-in country-of-birth
responses. (Extra resources are needed to code written-in responses correctly.)
2006 Census data showed that India has become one of the seven most common
countries of birth, while Scotland no longer makes the top seven. For this reason, in
the 2011 Census form, India replaces Scotland as a country-of-birth choice.
Note that China has moved up to take the fourth space, while India is below China.
The counts for respondents born in China justify a higher position in the response
option listing as this will reduce respondent burden.
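The revision rule described above – the seven most common responses from the previous census, plus the Cook Islands for data-quality reasons, ordered by count – can be sketched as a small routine. The counts below are invented for illustration and are not actual 2006 Census figures:

```python
# Hedged sketch of the birthplace response-option revision rule. Counts are
# invented placeholders, not real 2006 Census birthplace counts.
def revise_birthplace_options(counts, always_include=("Cook Islands",), top_n=7):
    """Pick the top_n most common countries, force-include listed countries,
    and order the final list by count so common options appear first."""
    top = sorted(counts, key=counts.get, reverse=True)[:top_n]
    for country in always_include:
        if country not in top:
            top.append(country)
    return sorted(top, key=counts.get, reverse=True)

illustrative_counts = {
    "New Zealand": 2_960_000, "England": 202_000, "China": 78_000,
    "Australia": 63_000, "Samoa": 50_000, "India": 43_000,
    "South Africa": 42_000, "Scotland": 29_000, "Cook Islands": 15_000,
}
options = revise_birthplace_options(illustrative_counts)
# Scotland falls out of the top seven; India stays in; the Cook Islands is
# appended despite its lower count, mirroring the rule in the text.
```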
Cognitive testing findings
Cognitive testing did not specifically target people born in Scotland, but one phase of
testing did focus on overseas-born respondents – none of whom were from Scotland.
However, the respondents born in India who were cognitively tested reported they
were pleased to be able to mark a response rather than write one in.
Dress rehearsal findings
The reasons for changing the response options are compelling, but a potential risk is
compromising the response quality for the countries that make up the United Kingdom
(Scotland, Wales, and Northern Ireland).
Evaluations of dress rehearsal data looked specifically at any United Kingdom-type
responses (for example, ‘UK’, ‘Britain’), and whether these became more prevalent
with only ‘England’ in the list. Evaluations found that, as expected, only a very small
number of respondents gave United Kingdom-type responses in the dress rehearsal.
Further, the rate of written-in Scotland responses in the dress rehearsal was in line
with expectations based on 2006 Census data where the option appeared in the listing
of countries.
India responses in the dress rehearsal were broadly in line with expectations based on
2006 Census data.
With no evidence to suggest response quality was being compromised, giving India a
tick-box instead of Scotland is justified by the relative counts.
Future development
Analysis of 2011 Census data may show another country of birth has emerged to
displace one of the countries of birth offered as a response option in the 2011 Census
forms. A similar process of identifying the risks and benefits of changing the listing
should be undertaken before implementing any change.
Question 14 – Māori descent
2006 Census question
2011 Census question
Description of change
Both the instruction to “Look for the ‘go to’ instruction after you answer the question”
and the extended routing instruction “mark your answer and go to …” have been
removed. Instead, a prompt to remind respondents how to mark their answers has
been included.
Rationale for change
Māori descent has historically had a high non-response rate.
The layout and wording of the Māori descent question were changed quite markedly
between the 2001 and 2006 Censuses in an effort to address high rates of
non-response. One device used in 2006 was the addition of introductory text telling the
respondent to “Look for the ‘go to’ instruction after you answer the question”. This was
intended to help respondents route correctly; however, the non-response rate did not
improve.
Māori descent is an important variable that is used in conjunction with electoral
registration data to calculate Māori and general electoral populations and determine
the number of electorates.
Statistics NZ also uses Māori descent data to form the subject population for published
iwi affiliation data. Respondents who answer ‘yes’ to Māori descent make up the
subject population while ‘no’, ‘don’t know’ and non-responses are excluded from the
subject population for iwi affiliation – even if they then go on to give an iwi response. In
practice, this means for an iwi response to be counted in standard Statistics NZ
outputs about iwi, a respondent must answer ‘yes’ to the Māori descent question. In
2006 compared with 2001, an increased number of respondents gave no response to
Māori descent but went on to give an iwi response. This question change aims to
reduce this rate of non-response to lower than 2001 levels.
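The subject-population rule described above amounts to a simple filter: an iwi response is counted in standard iwi outputs only when the respondent answered ‘yes’ to Māori descent. The sketch below is illustrative only, with hypothetical field names rather than actual Statistics NZ processing code:

```python
# Illustrative sketch of the iwi subject-population rule. Field names and
# records are hypothetical; this is not Statistics NZ processing code.
def counted_iwi_responses(respondents):
    """Keep only iwi responses from respondents who answered 'yes'
    to the Māori descent question."""
    return [r for r in respondents
            if r.get("maori_descent") == "yes" and r.get("iwi")]

respondents = [
    {"maori_descent": "yes", "iwi": "Ngāti Porou"},    # counted
    {"maori_descent": "", "iwi": "Ngāi Tahu"},         # non-response: excluded
    {"maori_descent": "don't know", "iwi": "Tūhoe"},   # excluded
    {"maori_descent": "yes", "iwi": ""},               # no iwi response given
]
counted = counted_iwi_responses(respondents)
# only the first respondent's iwi response is counted
```

The second record shows why the non-response pattern matters: an iwi response paired with a blank Māori descent answer is lost from standard iwi outputs.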
Cognitive testing findings
Initial cognitive tests, with the question unchanged, hinted that while the overall design
of the question was good for respondents, the “Look for the ‘go to’ instruction …”
sentence was perhaps too successful in getting respondents to look for where to go.
Respondents were observed simply going to where they were directed by the
instruction and continuing without marking the response oval. This was suspected to
be one of the behaviours contributing to non-response to this question.
A version of the question without the “Look for …” instruction was tested and while it
appeared that respondents were taking more care with actually marking the response
to the question, some respondents were still observed moving on without marking.
Some people seemed to like to read everything before marking their answers.
A version of the question was then trialled with the shorter ‘go to’ routing instruction.
This question, with the shortened routing and without the “Look for the ‘go to’ …”
instruction, performed well through a series of cognitive tests and was implemented
as the question for the dress rehearsal.
Dress rehearsal findings
In the dress rehearsal, the non-response rate to the Māori descent question improved.
The dress rehearsal non-response rate was significantly lower than for the comparable
meshblocks from the 2006 Census (meshblocks are the smallest geographic units for
which data is collected).
However, over 30 percent of forms were completed online, where the difference in
form design means item non-response is typically lower than for paper forms. Once
this impact is taken into account, the non-response rate in dress rehearsal was still
marginally better than in 2006, though it appears some of the improvement is due to
the high rate of Internet response.
While the evidence was not overwhelmingly in favour of the modified question, the
question was certainly not worse and has the potential to perform better than the 2006
question.
Future development
Evaluations of 2011 Census data should determine whether the change to the
question lowers the rate of non-response to this question.
The greater take up of the online form is expected to result in a lower non-response
rate overall, but it will still be worthwhile specifically evaluating the non-response rates
in paper forms to assess the effectiveness of the question change.
Questions 16 and 17 – Health problem or condition, and Disability
2006 Census question
2011 Census question
Description of change
Question 16 – The response options have been updated to better match the screening
questionnaire administered to people who are selected for the post-censal Disability
Survey. Also, one minor update has been made to the question text to help the
question flow better into the response options.
Question 17 – The outdated term ‘handicap’ has been removed from the question text
and a clause has been added that the disability must stop the respondent from doing
everyday things.
Rationale for change
A Household Disability Survey will follow the 2011 Census, as was the case in the
2006 Census. For that reason, the 2011 Census will continue to have questions on
disability.
The Disability Survey team engaged the census form design team to investigate the
possibility of developing new disability questions with the following aims:
1. To revise the question wording with language more appropriate for 2011.
2. To remove the requirement for respondents to compare themselves with
people their own age.
3. To improve the match between census form responses and subsequent
Disability Survey responses and, in turn, improve sampling efficiency.
4. To improve data quality to the point that disability data may be output.
The questions went through several evolutions before the final wording for the 2011
Census was reached. Other designs were developed, tested, and discarded for
reasons including:
• not meeting the needs of the topic experts
• respondent confusion
• unacceptable rates of people with health conditions failing to identify as such in
the question.
Cognitive testing findings
Results from cognitive testing were encouraging. Some testing focused on
respondents known to have some form of health problem, and these tests got very low
rates of false-negative response. Those respondents who did have health problems or
conditions were able to use the listed options to determine the things they had difficulty
with. Also, there was evidence of older people counting age-related conditions, which
was one of the aims of the redevelopment.
Some cognitive tests included the administration of the screening questionnaire for the
Disability Survey at the end of the cognitive test. These tests were intended to investigate
how well the responses to the census disability questions translated to correct
screening in or out of the Disability Survey. Results from these tests suggested a close
match and accurate screening between the conditions listed and the content of the
screening questionnaire. Results from cognitive testing and the small-scale pilot test
were encouraging enough to trial the new questions in the dress rehearsal.
Dress rehearsal findings
The Disability Survey team evaluated the dress rehearsal data on behalf of the form
design team. Evaluations focused on the effectiveness of the dress rehearsal
questions as screeners for the Disability Survey, but some analysis of question
response was also undertaken.
Comparing dress rehearsal with 2006 Census data was of limited value since the
questions had changed so much; however, comparisons of the derived disability
indicator showed slightly lower rates of disability in the dress rehearsal. This, in itself,
is not of concern, as the key thing is how effectively the question is performing as a
screener for the Disability Survey.
The method used to evaluate the question’s effectiveness as a screener was to make
contact with a sample of the dress rehearsal participants and administer the screening
questionnaire. This method more closely replicated the actual post-censal survey
processes than previous testing because:
1. the screening questionnaire was administered by telephone
2. there was a gap of several weeks between the dress rehearsal and
participation in the follow-up.
Samples of respondents who indicated they had a disability, and a smaller sample of
those who didn’t, were contacted with the aim of establishing rates of false-positive
and false-negative responses to the dress rehearsal disability questions. Results of
this exercise were encouraging, but numbers in the dress rehearsal sample were too
small to draw any firm conclusions. While this testing was unable to conclusively prove
the new questions were better than in 2006, there was no evidence to suggest they
were any worse, so the decision was made to proceed with the new questions for the
2011 Census.
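The follow-up exercise described above reduces to tabulating false-positive and false-negative rates between census disability responses and the subsequent screener outcomes. A minimal sketch of that tabulation follows — the data, function, and field semantics are hypothetical illustrations, not Statistics New Zealand's actual evaluation code:

```python
# Sketch of the screener evaluation: each record pairs a census
# disability indicator with the follow-up screening questionnaire
# outcome (both booleans). Hypothetical illustration only.

def screener_error_rates(records):
    """Return false-positive and false-negative rates for census
    responses, taking the screener outcome as the reference."""
    false_pos = sum(1 for census, screener in records if census and not screener)
    false_neg = sum(1 for census, screener in records if not census and screener)
    pos = sum(1 for census, _ in records if census)
    neg = len(records) - pos
    return {
        "false_positive_rate": false_pos / pos if pos else 0.0,
        "false_negative_rate": false_neg / neg if neg else 0.0,
    }

# (census said disability, screener confirmed disability)
sample = [(True, True), (True, False), (False, False), (False, True)]
rates = screener_error_rates(sample)
```

As the report notes, with a dress rehearsal sample this small, such rates carry wide uncertainty and cannot support firm conclusions on their own.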
Future development
Following the 2011 Household Disability Survey there will be enough information to
determine whether the changed questions are performing more or less effectively as
screeners for drawing the Disability Survey sample.
It is also possible that Disability Survey data will give a clear indication of the quality of
the census disability data and help determine if it may be output.
Question 19 – Living arrangements
2006 Census question
2011 Census question
Description of change
Text for the same-sex and opposite-sex civil union partner response options has
been updated with the term ‘legally registered’.
In addition to the wording change, a guide note bubble has been included to direct
respondents to an extended guide note that further defines a civil union. A shorter
guide note was included in the 2006 guide notes, but it was not accompanied by a
bubble within the form.
Rationale for change
The rationale for including the term ‘legally registered’ is fully explained in the next
section on legally registered relationship status. Once the decision was made to
include the term in question 23, the term was introduced here for consistency.
While the main data quality issue addressed through the guide notes was in relation to
question 23, the layout of question 23 was unable to accommodate a guide note
bubble. Question 19 was better able to accommodate the guide note bubble and, as
this is the first place in the form to refer to civil unions, it makes sense to direct
respondents to the guide notes from here.
Cognitive testing findings
This question was not formally investigated in cognitive testing; the change was
needed for consistency with question 23.
Cognitive testing found no respondents who stated they were living with their civil
union partner except for the few tests conducted that specifically targeted same-sex
civil union partners – these respondents did state their living arrangements correctly.
Dress rehearsal findings
This question was not specifically evaluated following the dress rehearsal, but was
used to help evaluate legally registered relationship data.
Question 23 – Legally registered relationship status
2006 Census question
2011 Census question
Description of changes
The question text and information box now include the term ‘legally registered’.
Other changes to the question text are the replacement of ‘is true’ with ‘best
describes’ and the addition of the word ‘current’.
In the response option text, ‘legally registered in a civil union’ has replaced instances
of ‘legally joined in a civil union’. Also, the term ‘bereaved’ has been replaced with
‘surviving’ in the third response option.
‘Civil union’ has been added to the divorced/dissolved response option text.
Rationale for change
There were some data quality concerns with this question from 2006. The main issue
was with the significantly higher than expected counts for civil unions compared with
registration figures. Another focus was on reducing multiple responses to this
question.
As already discussed, the 2011 form includes a guide note bubble in question 19
where civil unions are first mentioned. From the guide notes, respondents are given
information (for question 19 and 23) that will help them to answer these questions
accurately.
The question and response option wording has been updated with the term ‘legally
registered’. This change was made to conform to the legally registered relationship
standard but, more importantly, to convey to respondents that civil unions require a
legal action by a couple and that civil union status can’t just be assumed after a certain
time. This wording change is intended to address some of the misunderstanding
associated with de facto relationships being counted as civil unions.
In an attempt to address multiple response, the phrase ‘Which one of these
statements best describes …’ replaces the 2006 wording, ‘Which one of these
statements is true …’. This new wording better suggests to the respondent that only
one option is to be selected. The addition of the word ‘current’ is also intended to
further reinforce that only a single response is required.
This question was also subject to a classification review, meaning some wording
change was needed to conform to the revised classification and to reflect real
world change. The real world change meant including ‘civil union’ in the
divorced/dissolved response option. In 2006, civil unions were so new that it was not
possible for a civil union to have been dissolved, which is no longer the case.
Cognitive testing findings
This question was first tested unchanged to help understand what was causing civil
union data to be so at odds with data from civil union registrations. This testing
showed that there was a lack of understanding about exactly what a civil union was
across all respondent types. Having said that, for the majority of respondents, this lack
of understanding was of no consequence as civil union was, correctly, not an option
they were seriously considering.
Testing found only a few respondents who even considered marking civil union as their
response. Typically, these people had never been married but lived with a partner and
were looking for the best way to answer. These respondents were not clear that this
question is concerned only with legally registered relationship status. Even those
respondents who understood that the question only concerned legally registered
relationship status wanted somewhere to note their relationship.
Introducing the term ‘legally registered’ to the question wording and response options
did have some effect on respondent understanding, but overall there remained a poor
understanding of exactly what a civil union was.
Four cognitive tests were undertaken with respondents known to be in a same-sex civil
union. As might be expected, these respondents answered correctly and understood
exactly what a civil union was. These respondents were asked for their perspective on
the change from ‘bereaved civil union partner’ to ‘surviving civil union partner’. As
there was no unanimous feeling for either term, the wording as it is in the legally
registered relationship standard is used – ‘surviving civil union partner’.
Dress rehearsal findings
The dress rehearsal data showed there were still far more civil unions than might be
expected. Comparing responses to the legally registered relationship status and living
arrangements questions suggested that while some civil union responses were
genuine, most were almost certainly not true civil unions.
Evaluations of dress rehearsal data suggested that, as in 2006, civil union data is
unlikely to be fit for use in 2011. However, dress rehearsal data did show the data was
in line with expectations for all other relationship status categories.
Incidences of multiple responses to the question were lower than in 2006; however, it
is not clear whether this was because of the changes to the question wording or simply
down to the characteristics of a dress rehearsal.
Future development
Evaluations of 2011 Census data will determine whether the civil union data collected
in this question has improved.
Future census developments may further investigate this topic to gauge the
importance of it to users and determine the best way to obtain data on legally
registered relationship status. This may involve further exploring the merging of the
legally married and civil union categories. This option was not pursued in the 2011
form development due to stakeholder discomfort at affecting ‘legally married’ data,
which continues to be of acceptable quality. There were also concerns about possible
negative reactions from respondents and about respondents selecting a combined
legally married/civil union response option when they were neither of these, but rather
were in a de facto relationship. Rigorous testing of respondent attitudes toward
merging these two response options would be needed before a change like this could
be made.
Respondent confusion between ‘civil unions’ and ‘civil ceremonies’ may reduce due to
the revised civil union guide notes and as awareness of civil unions increases
generally. This area will benefit from further investigation in the next census form
development phase.
Question 26 – Highest school qualification
2006 Census question
2011 Census question
Description of change
The fourth response option, previously ‘NZ Scholarship level 4’, has had ‘level 4’
removed, so is now just ‘NZ Scholarship’.
Rationale for change
NZ Scholarship is no longer classed as a level 4 qualification. Removing ‘level 4’
from the response option means that anyone who earned the qualification can select it,
regardless of whether it was a level 4 qualification at the time. The alternative
would have been to have both ‘NZ Scholarship level 4’ and ‘NZ Scholarship’ in the
list of options, which would have been clumsy.
Evidence supporting change
This question was not formally investigated in cognitive testing or evaluated in the
dress rehearsal, as it was simply an update for real world change. Cognitive testing did
show that this question has a high level of cognitive load associated with the
complexity of the response categories, but this is an existing issue with the question
that has not been increased due to the change.
Future development
Secondary school qualifications are particularly subject to real world change. This
question should be monitored for such change and, if further changes are made
to the qualifications, consideration must be given to redesigning the question to
accommodate the complexity and length of its response options.
Question 31 – Total personal income
2006 Census question
2011 Census question
Description of change
Both the ‘$50,001–$70,000’ and ‘$100,001 or more’ income bands from 2006 have
each been disaggregated into two bands.
Rationale for change
Incomes have generally increased over time meaning that aggregations that were
once appropriate are now becoming less useful for data users. In the 2006 Census,
the ‘$30,001–$40,000’ band was disaggregated into two bands with positive feedback
from data users. Data users demonstrated a need for further disaggregation to provide
them with better detail for those in higher income groups, while also retaining the level
of data collected for lower income groups.
Changes to the typesetting made it possible to incorporate the disaggregated income
bands without losing detail. (Originally, it seemed that accommodating the
disaggregated income bands would mean aggregating some of the lower income
bands. This solution would not have been ideal because while users needed the
extra detail offered by disaggregation, they did not want to sacrifice existing detail.)
Cognitive testing findings
Cognitive testing showed that while respondents had some difficulties with this
question, none of these were related to the newly disaggregated income bands. The
difficulties that arose were related to the perception that this question is intrusive and
that it is difficult to calculate an annual income from weekly or fortnightly wages,
salaries, or other income sources. In those tests where guide notes were made
available, respondents who had difficulties in calculating an annual income did make
use of the guide notes.
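The calculation the guide notes assist with amounts to multiplying a regular payment by the number of pay periods in a year. A minimal sketch, assuming standard period multipliers — this is a hypothetical helper, not the actual guide-note income table:

```python
# Rough annualisation of income from common pay periods, mirroring the
# kind of calculation the guide notes' income table supports.
# (Hypothetical helper and multipliers; not the actual guide notes.)
PERIODS_PER_YEAR = {"weekly": 52, "fortnightly": 26, "monthly": 12, "annual": 1}

def annual_income(amount, period):
    """Convert a regular payment to an approximate annual total."""
    return amount * PERIODS_PER_YEAR[period]

# eg a $900 fortnightly wage corresponds to 900 * 26 = $23,400 a year
```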
Dress rehearsal findings
Evaluations of dress rehearsal data showed that the income distribution data was
broadly consistent with expectations. Comparing dress rehearsal data with previous
census data showed growth in income bands that might be expected given the
passing of time.
Comparisons with more current Household Economic Survey (HES) data were also
made. These comparisons showed that the dress rehearsal data had higher
percentages than expected in the two highest income brackets, with variations
observed in other brackets as well. Rather than suggesting a problem with the
question, this more than likely reflects the nature of the dress rehearsal sample. For
example, apartments were specifically targeted in the dress rehearsal and apartment
dwellers are known to have higher than average salaries.
The dress rehearsal follow-up survey asked respondents whether they used the guide
notes and, if so, what for. The most common reason they reported was to use the
income calculation table.
Future development
Data users continue to want more and more detail from this question. In both 2006 and
2011 there has been disaggregation of previously broader income bands. Should
requests for more detail come in 2016, the current placement in the paper form will
make further disaggregation problematic. The next census development may need to
look at new ways to ask this question that will accommodate data user needs within
the constraints of the paper form design.
Dwelling form changes
Question 3 – Number of occupants
2006 Census question
2011 Census question
Description of change
The information box between the question text and the response space has been
reworded slightly.
The reference to the Internet and paper forms has swapped order.
Rationale for change
In 2006, the first bullet point reminded the respondent to count babies, whereas in other
census materials, respondents are reminded that babies, children, and visitors
spending census night in a dwelling must be counted there. As this is the most
visible message for helping respondents to correctly count the number of occupants, it
was felt that this message should carry the full inclusions list.
The reference to Internet and paper forms is reversed for consistency with the
approach elsewhere in the forms.
Evidence supporting change
This change was not tested in cognitive testing or in the dress rehearsal. However, key
stakeholders who were consulted agreed this was a good change with little chance for
a negative effect on data quality.
Question 4 – Dwelling description
2006 Census question
2011 Census question
Description of change
The question text now affirms that the dwelling being asked about is the same as ‘…
the dwelling given in question 2’.
The response options have been updated, first by adding ‘a’ before each option so
they flow on from the new question text. Also, the key distinction between the
first two response options, ‘not joined’ versus ‘joined’, has been bolded to emphasise
that this is the type of information required for this question.
Rationale for change
Evaluations of 2006 Census data suggested some data quality issues with this
question, including uncertainty about what constitutes a ‘dwelling’ and what
information to provide when answering this question.
Referencing respondents back to the address question is designed to help them focus
on their part of a larger building where this is appropriate, since in the address
question they are asked to give the full address, including a flat number if relevant.
Bolding the words ‘joined’ and ‘not joined’ helps to highlight the key difference between
response options one and two and puts the focus on the dwelling structure. Also,
removing the brackets around ‘not joined’ helps the respondent to read the clause
correctly – as brackets are often a cue to a reader that they can stop reading.
Also, in 2006, a significant number of ‘other’ responses gave information that was not
required (eg that the dwelling was brick), but did not give information that was required
– whether it was separate or joined. The question change was intended to reduce the
number of these types of responses.
Cognitive testing findings
In early rounds of testing, using unchanged questions, it was observed that
respondents were answering question 2 with a good degree of accuracy but were then
having trouble with question 4. This was especially true for people in the ‘joined’ types
of dwellings. For example, people in a joined dwelling would think of the specific part
of the building (eg apartment 21A) that made up their dwelling when responding to
question 2. However, when presented with question 4, respondents began to think of
the overall structure when asked about the ‘dwelling’.
This confusion led to trying question wording that references the dwelling as that given
in question 2. Respondents were seen referring back to question 2 before marking
their answer to question 4. For many people in a typical ‘not joined’ dwelling type this
was simply a quick glance without a second thought before answering. For those in a
‘joined’ dwelling type, in some cases it did make the respondent think a little more
before answering. Respondents’ answers, both to the question itself and to additional
probe questions, suggested that the change had a positive effect on some, but not all,
people in a complex joined dwelling.
Dress rehearsal findings
Evaluations of dress rehearsal data suggested that the form changes were having
some positive effects on data quality. One effect of the form change appears to be a
slightly higher incidence of respondents writing in an ‘other’ response. This in itself is
not desirable; however, almost all these written-in responses to ‘other’ contained
enough detail for the response to be classified as separate or joined. This is a
significant improvement on 2006 where many written-in responses could not be
classified as separate or joined, resulting in lower data quality.
Future development
Evaluations of 2011 Census data should determine if the change to the question has
had a positive impact on data quality.
Question 5 – Number of storeys
2006 Census question
2011 Census question
Description of change
The words ‘(single level)’ have been added to the first response option to clarify
the meaning of ‘one storey’.
Rationale for change
Data from the 2006 Census showed an implausibly large number of ‘none of these’ responses.
The bracketed words ‘single level’ have been added to help convey the meaning of
one storey for people who do not naturally think of their dwelling as having storeys.
Cognitive testing findings
While the majority of people tested answered this question with no problems at all,
cognitive testing with the question unchanged from 2006 confirmed that some people
known to live in a one-storey dwelling thought of their dwelling as having no storeys.
When the term ‘single level’ was added, some people who had no difficulty with the
question did express some surprise at the need for clarification. However, for the small
number of people who did have second thoughts about how to answer, the addition of
‘single level’ did appear to help.
Dress rehearsal findings
Though the dress rehearsal sample was too small and unrepresentative to make firm
conclusions, the addition of the term ‘single level’ does appear to be of some help.
Dress rehearsal data for this question showed a reduction in ‘none of these’ responses
while the percentage of one-storey responses was up slightly compared with 2006.
While it is impossible to be sure, this could be due to respondents who would have
otherwise given a ‘none of these’ response now marking the ‘one storey (single level)’
response option due to the addition of the words ‘single level’. The rates of ‘two or
three storeys’ responses also went up slightly in the dress rehearsal areas. This may
suggest that some two-storey dwellings were being counted as one storey in 2006 and
are now being counted correctly.
Future development
Evaluations of 2011 Census data should determine if the change to the question has
improved data quality.
Question 12 – Amount of rent paid
2006 Census question
2011 Census question
Description of change
The question wording has been updated and a typesetting change made to encourage
improved response quality.
Rationale for change
In the 2006 Census, a number of respondents provided weekly rent amounts that were
more than expected and needed further investigation in data processing.
Investigations showed a number of respondents were entering their own decimal
points and reporting the number of cents as well as dollars. Scanners were not
recognising the decimal points and so, for example, responses such as $300.95 were
being recognised as $30095.
Being more explicit about ‘to the nearest dollar’ is intended to discourage reporting of
the number of cents. Also, the typesetting has been modified with a comma to better
demonstrate where figures should be entered. Finally, the decimal point separating the
‘dollars’ fields from the pre-filled ‘cents’ fields no longer appears in its own constrained
box. This is intended to give the respondent less white space in which to be tempted to
write something.
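The scanning problem described above — a written decimal point turning $300.95 into 30095 — can be illustrated with a small repair heuristic. This is a hypothetical sketch under an assumed plausibility bound, not the actual data-repair rule used in census processing:

```python
# Illustration of the 2006 scanning issue: a respondent writes $300.95,
# the scanner misses the decimal point, and the value is read as 30095.
# (Hypothetical heuristic; actual census data-repair rules are more
# involved.)

def repair_weekly_rent(scanned_value, plausible_max=2000):
    """If a scanned weekly rent is implausibly large but dividing by 100
    (treating the last two digits as cents) gives a plausible amount,
    assume a missed decimal point and repair to the nearest dollar."""
    if scanned_value > plausible_max and scanned_value / 100 <= plausible_max:
        return round(scanned_value / 100)
    return scanned_value

# eg 30095 is repaired to 301; a plausible value like 350 is left alone
```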
Cognitive testing findings
Cognitive testing did not find any respondents exhibiting the behaviours observed in
the 2006 Census. Because of the artificial nature of cognitive testing and the fact
cognitive test volunteers tend to be ‘good’ form fillers, the absence of errors cannot be
taken as proof that the change is working. Only around half of cognitive testing
respondents were renting their dwelling.
Dress rehearsal findings
The dress rehearsal showed a very small number of unlikely rent amounts. The
majority of these were explained by a long rent period being given.
Processing analysts viewing the completed forms noted no incidences of respondents
entering decimal points. The remaining unlikely rent amounts in dress rehearsal may
be because of spurious responses.
Future development
Evaluations of 2011 Census data should determine if the change to the question has
had a positive impact on data quality, particularly in terms of the number of responses
needing data repair.
Question 18 – Number of motor vehicles
2006 Census question
2011 Census question
Description of change
Extra information has been added to the instruction box between the question and
response options to provide more detail about the vehicle types that should not be
counted.
Rationale for change
The statistical standard for motor vehicles provides a lot of detail about the types of
vehicles that count and don’t count for the purposes of the standard. A lot of this detail
is not appropriate for an instructional bubble, but this is available to respondents via an
online guide note. Two increasingly important exclusions are being highlighted
however.
First, the exclusion of vehicles for work has been extended to include ‘farm only’
vehicles. Specialised farm vehicles, such as tractors, should not be counted, nor
should any vehicles that are not legally allowed on the open road.
The exclusion regarding motorbikes has been clarified by also noting that scooters are
not to be counted. An instruction not to count scooters was included in the 2001
Census but was removed in 2006. However, because of a marked increase in scooter
registrations since then, the instruction to exclude scooters has been reintroduced for
data quality purposes.
Cognitive testing findings
For most people this question was straightforward. The probe “what vehicles were
you counting?” usually drew answers related to who in the household the vehicle
belonged to. Most households tested only had private cars.
Those households that did have work vehicles usually included or excluded them
correctly based on whether they were allowed to use them for private purposes. Most
employees who had work cars reported that they were allowed to use them for
personal use.
The group that had most difficulty with this question was self-employed people for
whom the distinction between work and private use is particularly blurred. In these
cases, respondents would give the question a great deal of thought and generally
came to sound conclusions, but the cognitive load was higher than for other
respondents with work cars.
When probed to find out what vehicles people were counting, no one reported that
they were counting a scooter. Those who did have scooters and did not count them
reported that they had seen the note not to count scooters. Some respondents felt that
having the instruction to not count motorbikes in both the question and the instruction
box was excessive.
Dress rehearsal findings
This question was not formally evaluated beyond comparing the distribution of
responses with 2006 data. The data distributions were similar, with a
slightly lower percentage of households reporting having three or more vehicles than
in 2006.
Non-response to the question remained at the same low level observed in 2006.
Future development
Evaluations of 2011 Census data should provide some indication of whether the
changes to the question have helped to improve data quality.
Also, a need to review the criteria for inclusion and exclusion has been identified. This
is to ensure that the data collected continues to be relevant.
2011 Census Māori/English form development
Background
Statistics NZ has offered census respondents the option of census forms in Māori
since 1996, when separate Māori forms were produced alongside the English-only
forms.
In 2001, a side-by-side language format was introduced that presented both
languages on one form: the Māori on one side mirrored the English on the other side.
This allowed respondents to easily compare the Māori and English language used.
This format was continued in 2006 and guide notes were incorporated into the forms.
In 2006, the census forms were also available online in English and Māori. The Māori
online forms featured a ‘hover’ function to mimic the side-by-side format of the paper
forms. Respondents could hover the cursor over the Māori to see the English
translations of the text.
The 2011 Census continues to employ the side-by-side format of language in the
Māori/English paper forms. For the Māori online forms, the hover translation function
has been replaced with a ‘toggle’ function for each question. The ‘Online form
development’ section has more discussion of this feature.
2011 form development
Development environment
The Māori language forms were subject to the minimal change strategy that
underpinned the English forms development and 2011 Census development as a
whole.
As in 2001 and 2006, an in-house questionnaire designer with Māori language skills
worked alongside the English language questionnaire designers. Also, for quality
assurance, two Māori language experts (certified by Te Taura Whiri i te Reo Māori/The
Māori Language Commission) were contracted to translate and review the Māori
content.
Development objectives
In the development of the Māori content of the Māori/English forms, there were two
main objectives:
• Understandability – Māori language respondents have a range of proficiency
levels. New questions and text needed to be as simple and understandable as
possible for the widest range of Māori language respondents.
• Conceptual equivalence – Information collected from the Māori and English
questions must produce comparable responses if the data are to be meaningful.
That is, questions need to convey and measure equivalent concepts across
Māori and English.
Review of the 2006 Māori language forms
In addition to making the 2011 Census content changes, a decision was made to
undertake an overall review of the 2006 Māori/English forms. The review was a quality
assurance measure to ensure that the language used was consistent across the forms
and that no errors had been introduced.
The review identified several issues to be addressed in the 2011 environment. These
were:
• typographic and grammatical errors
• inconsistent phrasing and spelling
• obscure or out-of-date terminology
• conceptual differences between Māori and English questions.
Recommendations for change were subject to the same change consultation process
as those for the English forms.
Testing
The Māori/English forms were tested using the same methods as the English forms,
with the exception of the pilot test, which tested only the English forms. The overall
testing methodology is explained in the ‘Testing overview’ section.
Cognitive testing
Cognitive testing is a vital component of quality assurance in the questionnaire
development process. In the context of the Māori/English forms, cognitive testing is
particularly important as it serves the dual purpose of testing the two main
development objectives of understandability and conceptual equivalence with the
English questions. Also, cognitive testing is useful for identifying previously undetected
issues with existing language.
Thirty-six cognitive tests of the Māori/English forms were undertaken over three testing
rounds from January–June 2009. Interviewees came from a range of ages,
backgrounds, regions, and ethnicities (though the majority were Māori). Both men and
women were interviewed. Respondents reported a range of proficiencies in Māori
language, from being able to understand only basic language (instructions and simple
phrases) through to those who said they were able to understand nearly everything in
Māori.
Of the 36 tests, 13 were conducted ‘blind’, ie in Māori only. This meant the interviews
were conducted in Māori, using only the Māori content of the form and with the testing
protocol in Māori. The remainder of the tests were undertaken using the Māori/English
forms, where respondents could refer to the English questionnaire as needed.
In each of the testing rounds, questions that had changed were tested on respondents
to make sure that they worked in Māori. For example, first round testing of the new
wording in Māori for the disability questions established that respondents found the
questions hard to distinguish from each other, ie they thought the questions were
asking the same thing in two different ways. The questions were reworded and
retested. Testing of the new wording showed that respondents were answering the
questions as intended and that most of them no longer had difficulties with the
questions.
Blind testing showed that proficient Māori speakers encountered few problems with
comprehension of the Māori used in the forms. The main area of difficulty was with
unfamiliar terms, eg educational qualification terms, and new or changed terminology,
eg translation of ‘civil union’.
Detailed information on questions that were changed in the Māori/English forms can
be found in the sections on ‘Individual form changes’ and ‘Dwelling form changes’.
Some examples of changes to Māori language for 2011
2006 Census issue | 2011 Census solution | Rationale
Inconsistent spelling: rāwāhi/tāwāhi (two different dialectal spellings of ‘overseas’) | tāwāhi | ‘Tāwāhi’ was adopted because Te Taura Whiri advised that tāwāhi is more commonly used than rāwāhi.
Out-of-date terminology: hononga tangata (civil union) | hononga ā-ture | Te Taura Whiri-certified language advisors advised that the 2006 term was out-of-date.
Misspelling: hikoi (walk) | hīkoi | The spelling was incorrect because a macron was missing.
Obscure terminology: whare tāwhai (caravan) | whare wīra | Cognitive testing showed respondents lacked understanding of the 2006 terminology, so it was replaced with a more commonly understood term.
Conceptual difference: Ngā rūma e whakamahia ana hei rūma moe... (rooms used as bedrooms...) | Ngā rūma whai taonga moe... (rooms furnished as bedrooms...) | The English question asks for rooms furnished as bedrooms, but the Māori asked for rooms used as bedrooms – a different measure. The wording was changed so the questions are conceptually equivalent.
Inconsistent phrasing: Tohua te katoa o raro nei e pa ana ki a koe. (Mark all of the below that apply to you.) | Tohua te katoa o raro iho nei e hāngai ana ki a koe. (Mark all of the below that apply to you.) | The instruction wording differed from phrasing used elsewhere in the form. As the question was being changed, it was possible to make the phrasing consistent with other questions.
Incorrect grammar: Ehara ko tētahi o ēnei (None of these) | Ehara i tētahi o ēnei | Te Taura Whiri-certified language advisors strongly recommended correcting this grammar.
Summary and recommendations
Overall, the 2011 Census form design project made quality improvements to the
Māori/English forms.
Respondents were for the most part – subject to their Māori language proficiency –
able to complete the Māori/English forms competently. As noted in 2006, unfamiliarity
with bureaucratic terms and new terminology continued to cause the most difficulty for
respondents.
The side-by-side Māori/English format allows respondents to easily check their
comprehension of the Māori content and questions as they complete the forms. This
format has proven to work well for respondents, and its retention is recommended for
the 2016 paper forms.
Having an in-house questionnaire designer with Māori language skills working in the
form design team was crucial to the smooth development process for the
Māori/English forms. It allowed content in Māori to be assessed and revised in a timely
manner, and made the mediation of the external translation and review process
straightforward. Again, it is highly recommended that a Māori language specialist be
involved from early in the 2016 development process.
2011 Census guide note development
Background
The objective of the census form design team is to make the forms as easy as
possible for the majority of respondents to complete without the need to seek further
help or explanation. However, over successive censuses, guide notes have been
provided with the forms to give help with questions known to cause respondents the
most difficulty.
Guide notes are provided in English and in Māori, in print and in online formats.
Print guide notes – English
The English print guide notes for 2011 appear on a single sheet of A3 paper folded
into thirds, the same approach as taken in 2006. The guide notes carry messages for
everyone who fills in the forms, together with help for both the individual and dwelling
forms, all in one document.
Print guide notes – Māori/English
The approach from 2006 is continued, with the guide notes integrated into the design of
the Māori/English forms. At the back of these forms, both the Māori and English
version of the guide notes are provided in the same side-by-side format as the
questions to make them easy to compare. The messages and the dwelling form help
appear in the Māori/English dwelling form, while the Māori/English individual form
contains only the individual form help.
Online guide notes – English and Māori/English
The online guide notes are available both via a link from the census homepage and
from within the online forms, by clicking a help icon within a question. Clicking the
help link from within a form opens the guide notes in a new window that is
automatically scrolled to the appropriate place. A respondent accessing help from
within an online form will receive guide notes in the same language as the form being
completed.
A key difference between the print and online formats is that the online guide notes,
not being constrained by paper size, contain significantly more content. Decisions
about the content of the print guides were made in consultation with affected parties
and with consideration of relative priorities.
2011 Census guide note changes
In an environment of minimal change to the forms, the guide notes became an
important tool for addressing known quality issues from 2006. The guide notes give
the opportunity to add explanation to those questions with known quality concerns
without the need to change the questions themselves. Where guide notes have been
changed, this has been done in consultation with affected stakeholders to ensure the
accuracy of what is communicated in the guide note. The guide note changes were
also tested during some phases of cognitive testing, though the majority of cognitive
test respondents made no use of the guide notes.
Communications changes
The communications content of the guide notes has been revised to:
• ensure consistency with other key messages for the 2011 Census
• make census confidentiality messaging consistent across Statistics NZ
• advise the public what happens to census forms once they have been filled in
• explain what happens after the census, in particular with the post-censal surveys
Statistics NZ conducts.
In a change from 2006, the 2011 Census guide notes carry a stand-alone message
emphasising that the census is compulsory.
Brief descriptions of guide note changes for the 2011 Census
Each of the tables below briefly describes the changes made to the guide notes for the
2011 Census, including whether each change is a revision to a 2006 guide note or an
addition to the guide notes offered in 2006. Because not all changes were possible in
the print version of the guide notes, due to the paper size, the guide note versions
affected are also noted.
Individual form guide note changes
Question | Change type | Change description | Both formats, print only, or online only?
Name | Revision | Amended reference to post-censal surveys | Both
Usual address | Revision | Updated to reflect the new standard, particularly for children in shared care | Both
Usual address | Addition | Extra help for overseas tertiary students | Online only
Years at usual address | Addition | Extra help for respondents, particularly those who have had spells away from their usual residence | Online only
Ethnicity | Revision | Additional reference to uses of ethnicity data | Both
Māori descent | Revision | Minor change to guide note for respondents of Cook Island Māori descent | Both
Iwi listing | Revision | Iwi list updated following the iwi classification review; clarification added that the listing is a guide only | Both
Living arrangements | Revision | Description of civil union revised, with more detail given to address confusion between civil unions and civil ceremonies | Both
Legally registered relationship status | Revision | Updated to reflect the new standard and for consistency with the note for Q19 | Both
Sources of income | Revision | Updated with clarification of how to count new or emerging government payments | Both
Total annual income | Revision | Income calculation table updated to reflect tax changes | Both
Hours worked | Addition | Guidance for respondents with complex working patterns or situations | Online only
Dwelling form guide note changes
Question | Change type | Change description | Both formats, print only, or online only?
Dwelling membership | Revision | Updated for consistency with other changes related to civil unions | Both
Dwelling ownership | Revision | Further information about complex ownership arrangements | Both
Mortgage payments | Addition | Further advice for respondents about complex mortgages | Print only
Mortgage payments | Revision | Further advice for respondents about complex mortgages | Online only
Motor vehicles available for use | Addition | Information included on which motor vehicles to count or not count (online only in 2006) | Print only
Motor vehicles available for use | Revision | Existing online guide note updated to include more information from the standard | Online only
2011 Census online form development
Background
The 2006 Census was the first time that Statistics NZ offered respondents the option
to complete their forms online.
In 2006, the philosophy for the online form was to minimise the potential for mode
effects by making the user experience as similar as possible for both paper and online
forms. Features of the online form design intended to meet this aim include:
• a scrolling, single-page layout
• minimal use of error or warning messages prompting the respondent to check or
correct answers
• limited automatic routing of respondents past questions that don’t need to be
answered.
The development of the 2006 online form included extensive usability testing and an
expert review by Don Dillman. The expert review praised the design of the online
forms and feedback following the 2006 Census showed that the forms were well
received by the public.
Though the uptake of the online forms in 2006 was low (7 percent), this was
attributable to the strategy of the time, which was to not specifically promote the online
forms. The online forms will be actively promoted in 2011 and, because of this, uptake
of the online form option is expected to increase significantly.
Features of the 2011 online census forms
2011 Census website and homepage
The census homepage address has been simplified for 2011. The address, previously
www.stats.census2006.govt.nz, is now simply www.census.govt.nz.
During the census field collection period, users can access the online form either from
the homepage by clicking a ‘Census online’ menu bar or by clicking a ‘Complete your
forms online’ graphic. Using either of these methods, the respondent is taken straight
to a logon page.
Census online logon page
In 2006, respondents had to navigate from the census homepage through two screens
– a language selection screen and an information screen – before reaching the logon
page. Reducing the number of screens a respondent must navigate was a key
objective of the 2011 Census online form project.
In 2011, respondents will be presented with a logon screen immediately after clicking
through from the homepage. Early prototypes presented respondents with just the
logon fields; however, respondent feedback led to the inclusion of a security message
before respondents entered any codes. Once the security message was introduced
into the logon page, respondents were happy with the amount of information provided
before logging on.
Logging on
As in 2006, the process to log on to the online forms will require the respondent to
complete two fields to access the secure online forms.
1. The respondent enters an 11-character ‘Internet ID’ from the top right corner of
a paper form delivered to their dwelling.
2. The respondent then enters a nine-character ‘Internet Access Code’ from the
securely sealed envelope delivered to their dwelling.
Respondents who do not have paper forms can also have an Internet ID and Internet
Access Code emailed to them.
Both fields must be completed correctly before logon is allowed; if unsuccessful,
respondents are presented with an error message to let them know something is wrong.
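As a hedged sketch, the two-field check described above might look like the following. The report specifies only the field lengths (an 11-character Internet ID and a nine-character Internet Access Code); the function and field names here, and the length-only validation rule, are assumptions for illustration, not the census system's actual code.

```typescript
// Hypothetical sketch of the logon field checks described in the text.
// Only the field lengths come from the report; names and any rules
// beyond length are assumptions.

interface LogonAttempt {
  internetId: string; // printed in the top right corner of a paper form
  accessCode: string; // from the securely sealed envelope
}

// Returns a list of error messages; an empty list means both fields
// pass the basic length checks.
function validateLogon(attempt: LogonAttempt): string[] {
  const errors: string[] = [];
  if (attempt.internetId.trim().length !== 11) {
    errors.push("The Internet ID must be 11 characters long.");
  }
  if (attempt.accessCode.trim().length !== 9) {
    errors.push("The Internet Access Code must be 9 characters long.");
  }
  return errors;
}
```

In this sketch, both messages are collected before display, matching the report's description that respondents are told something is wrong rather than being silently blocked.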
The logon process did undergo some change from 2006. Most significantly, the
Internet Access Code was reduced from 12 characters to nine. Respondents
expressed no concerns that the nine-character access code was any less secure than
the 12-character one, and commented that the nine-character code was more user-friendly.
Also, because of the reduced number of screens before logon, particular attention was
paid to how easily respondents could log on. In usability tests, respondents were
observed to have little difficulty either locating the information needed to log on or
actually entering the codes. Dress rehearsal follow-up survey data supported the
findings of the usability testing, with nearly all respondents stating that the logon
process was either ‘very easy’ or ‘easy’.
Look and feel
The 2011 online forms maintain a look and feel as consistent as possible with the
paper forms: colours and fonts are kept as close as possible, and the design of the
question boxes generally matches that of the paper forms.
The single page scrolling format of 2006 was retained as this too helped maintain the
look and feel of the paper forms. Respondents have a sense of progression with a
scrolling format and questions retain the context from surrounding questions.
Alternative formats were not investigated for 2011 but, as in 2006, respondents were
generally satisfied with the scrolling format.
Response formats
Standard online form design principles are used within the online forms. Radio buttons
are used for questions requiring a single response. Check boxes are used for
questions where multiple responses are allowed.
Where a text response is required, a free-text response field is provided that allows a
respondent to enter text up to a prescribed character limit. These character limits vary
from as few as two to as many as 80 characters.
In cases where only a numeric response is required, the system prevents the
respondent from entering anything other than numeric characters.
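As a rough illustration, a numeric-only field of this kind might filter its input as below. The report describes only the behaviour (non-numeric characters cannot be entered); the function name and filtering approach are assumptions.

```typescript
// Illustrative sketch of a numeric-only input filter. The report describes
// only the behaviour; this implementation is an assumption.
function filterNumericInput(raw: string): string {
  // Strip any character that is not a digit before accepting the value.
  return raw.replace(/[^0-9]/g, "");
}
```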
Automated routing
As in 2006, a key feature of the 2011 forms is the minimising of respondent burden by
automatically routing respondents past questions that they don’t need to answer,
based on their answers to preceding questions.
The 2006 approach of alerting the respondent with a brief message and greying out
the affected questions is reused in the 2011 design.
In two places in the individual form, the questions that follow disappear rather than
appear as ‘greyed out’. This occurs after question 11 for respondents who do not
usually live in New Zealand and at question 20 for respondents who are aged under
15 years.
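The two disappearing-question rules above could be sketched as follows. This is a hedged illustration: the report states only which questions disappear and for whom, and all identifiers and the question-numbering scheme here are assumptions.

```typescript
// Hedged sketch of the two 'disappearing question' routing rules described
// in the text: questions after Q11 are hidden for respondents who do not
// usually live in New Zealand, and questions after Q20 for respondents
// aged under 15. All names here are illustrative, not the system's own.

interface RoutingAnswers {
  usuallyLivesInNZ: boolean;
  ageYears: number;
}

// Returns the question numbers that should disappear entirely
// (rather than being greyed out).
function hiddenQuestions(answers: RoutingAnswers, totalQuestions: number): number[] {
  const hidden: number[] = [];
  for (let q = 1; q <= totalQuestions; q++) {
    if (!answers.usuallyLivesInNZ && q > 11) {
      hidden.push(q);
    } else if (answers.ageYears < 15 && q > 20) {
      hidden.push(q);
    }
  }
  return hidden;
}
```

For example, under these assumed rules a respondent who does not usually live in New Zealand would see every question after Q11 removed from the form.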
Online help
The online system allows a respondent to access help in two different ways. The first
method allows the respondent to access help from the census home page, without the
need to log on to the secure area. Respondents can click a ‘Help’ link in the menu bar
that takes them to a ‘Help’ landing page.
This method is also available to respondents once they have logged on to the secure
forms area. Respondents completing forms can also access help for specific questions
from within the form itself. Each question that has an associated guide note has a
clickable ‘Help’ link in the bottom right corner of the question field. This takes the
respondent directly to the help relevant to that question.
Error and warning messages
The online forms make limited use of edits and consistency checking as the
respondent completes the form. Edits and checks can further be classified as ‘errors’
or ‘warnings’.
‘Error’ edits and checks are those that must be corrected before a form can be
submitted. These have been limited to critical routing points and data defined as the
‘foremost variables’: name, sex, date of birth, usual address, census night address,
and ethnicity.
‘Warning’ edits and checks apply to those responses that are inconsistent with other
responses or outside expected bounds. These are limited to only two places in the
forms.
Internal question validity is also built in to prevent contradictory responses within
questions. For example, it is impossible for a respondent to state they have no source
of income and that they earn salary or wages.
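A check of this kind might be encoded as below. The report gives only the rule for the income example; the option labels and function shape here are assumptions for illustration.

```typescript
// Illustrative encoding of the within-question consistency rule described
// above: 'no source of income' cannot be combined with any other source.
// Option labels are assumptions, not the actual form's values.
function isIncomeSelectionValid(selected: string[]): boolean {
  const none = selected.includes("no source of income");
  return !(none && selected.length > 1);
}
```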
Respondents are encouraged to check their forms with a summary panel that displays
the answers given to key questions earlier in the form. The respondent can check the
information and, if necessary, return to edit information by way of an ‘Edit’ hyperlink.
Māori/English online form features
Language selection
As in 2006, the online forms are available in both English and in te reo Māori.
The first point at which a respondent can select the Māori language is the logon page,
using a button in the navigation bar. This allows the respondent to log on directly to a Māori
form selection page. The language selection method has a low profile on this page,
but usability testing suggested there was not a high expectation among users to be
able to log on in Māori.
However, there is a high expectation among users to be able to complete their forms
in Māori. Therefore, once a respondent is logged on, it is important that the Māori
language option has a high profile. For that reason, every time a user logs on for a
new session, the form selection page carries a high profile message prompting the
respondent to change language if desired. The message also promotes the location of
the language toggle button that appears in the menu bar.
‘Within form’ translations
One key feature of the 2006 forms was the functionality that allowed users of a Māori
form to ‘hover over’ a translation button to view the corresponding English text. This
was intended to replicate the side-by-side easy comparability of the Māori/English
paper forms. Though popular with respondents, this approach was not compliant with
new government web standards that were developed after 2006.
The solution implemented for 2011 is to allow individual translation of a question while
the rest of the form remains in Māori. Translation is enabled by clicking an ‘English’
button that appears immediately above the question text in the top left corner of the
question pane. The location of the button is unchanged from 2006.
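The per-question toggle could be modelled minimally as below, assuming each question tracks its own display language independently of the rest of the form; this state-handling sketch is an assumption, not the actual implementation.

```typescript
// Minimal sketch of the per-question language toggle described above.
// Each question tracks its own display language; clicking the button
// switches just that question, leaving the rest of the form in Māori.
type DisplayLang = "mi" | "en";

function toggleQuestionLanguage(current: DisplayLang): DisplayLang {
  return current === "mi" ? "en" : "mi";
}
```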
The button label has been changed from the ‘Reo Ingarihi’ used in 2006 to ‘English’
for 2011. The label was changed mainly because many respondents using the Māori
forms have beginner-level proficiency and some may not immediately understand ‘Reo
Ingarihi’. Further, looking at translation features in other applications, it appears more
common for the button to be in the source language.
[Figures: Māori language form; Māori language form with Q8 in English]
While this solution doesn’t allow for the side-by-side comparison of the two languages,
it does preserve the form as a predominantly Māori language form and offers focused
language support when it is needed. Also importantly, it is compliant with government
standards.
Usability testing showed that respondents took some time to locate the translation
button but, once located, its use was intuitive. As the button location was unchanged
from 2006 and no better placement was obvious, the button remained where it was.
Also, once respondents had resolved the issue that caused them to seek translation,
there was a variation in behaviour with regard to toggling back to the Māori language.
Some respondents were very diligent about toggling it back to Māori while others were
happy to leave it in English and carry on.