Surveys guide

Guidance on running your own student survey
Why survey students?
It is vitally important that UCL understands the experiences of our students. Through student
surveys we are able to collect data about these experiences, identify areas of concern and,
where possible, make comparisons with other institutions in the sector. The results of some
of these surveys feed into quality assurance and governance processes, and inform UCL
policy at the highest level.
Thinking of running a student survey?
UCL is actively trying to reduce the number of surveys students are asked to take during
their time here. While we need to collect data, students understandably become frustrated if
they are asked to complete surveys too often. Surveys are often poorly designed, with
poorly-worded questions that are unlikely to produce good-quality information. They also
often produce results which are invisible to anyone except the initiators of the survey.
Survey principles
Start by reading UCL’s survey policy.
No institution-wide surveys should take place at the same time as the NSS, the New to UCL
survey, the UCL Student Experience Survey, PRES or PTES. Their timings are:

- NSS: early January to 30 April
- UCL Student Experience Survey: late March to late April (pilot year 2017)
- PRES: March and April
- PTES: early April to early June
- New to UCL: late October to 30 November
If you are thinking of running a survey, you should first consider the alternatives.



Is the data already being collected elsewhere?
We run several institutional surveys, achieving good response rates and thousands of
responses, and one of them may already ask the question you need answered. Check what
is already asked in our institutional student surveys before creating your own.
Could you collect the results in a different way?
For instance, through web analytics, a focus group, exit polls from specific events,
surveying students as they wait in line for a particular service, online polls via Moodle, or
Electronic Voting Systems (EVS) which can be used in lectures and events.
Could you request additional questions to be added to an existing survey?
UCL tries to keep surveys as short as possible but there is often scope to add additional
questions to our surveys. Get in touch if this is something you would like to do.
Running a survey
If you have decided to go ahead with running a survey, here are some things to consider:
Survey software
There are hundreds of software options for running surveys. Here are some of the
most popular:
Survey Monkey
What can it do?
Survey Monkey is a popular online, cloud-based survey tool offering various question types,
skip logic and page branding. It is compatible with mobile, web and tablet devices.
Cost: Survey Monkey charges based on the expected number of responses and the
complexity of the survey. A basic survey with 10 questions and 100 responses would cost
around £65.
Find out more: https://www.surveymonkey.co.uk/
Opinio
What can it do?
Opinio is a web-based survey tool which provides a framework for authoring and distributing
surveys as well as a range of reporting facilities.
Cost: Free for UCL staff and postgraduate students under UCL’s institutional licence
Find out more: https://www.ucl.ac.uk/isd/services/learning-teaching/elearning-staff/coretools/opinio
Qualtrics
What can it do?
Qualtrics is a cloud-based survey tool that makes it quick and easy to design your own
surveys. It's useful for creating research surveys and general purpose surveys. Qualtrics
provides more sophisticated survey functions, analysis and cross-tabulation of data than
many other tools. Qualtrics was developed with higher education in mind and many other
HEIs use it.
Cost: Quote-based. You can either pay for an annual subscription or per response.
Find out more: https://www.qualtrics.com/
Survey Gizmo
What can it do?
Winner of several industry awards, Survey Gizmo is a sophisticated market research tool.
Cost: More expensive than many of the alternatives; a standard annual licence for one user
costs around £1,000.
Find out more: https://www.surveygizmo.eu/
Designing your survey
The basic principles for designing a survey are to keep it short, consistent and clear.
According to YouGov, ‘respondent engagement with an online survey is dictated by subject
matter, visual appeal, question flow, page load speed and survey length.’ The relationship
between survey length and dropout rate is not linear, but there is a point (around 16
minutes) after which the dropout rate accelerates rapidly: respondents become fatigued,
pay less attention to the task in hand and speed up their responses.
[Chart: survey length against dropout rate. Source: YouGov]
Questions should be written in a consistent tense and worded clearly and simply, without
jargon or acronyms. If you can, test the wording and the time taken to complete the survey
with a potential respondent. It may seem obvious, but you should think about what you need
to know from the results. Do you want to know how satisfied students were with a service, or
about their experience of using it? Will respondents understand what you mean from the
wording?
You should consider which response options would be most appropriate for your survey.
This could be a Likert scale (for example, a four or five-point scale from strongly agree to
strongly disagree), a multiple-choice question, a ranking question or open-text responses.
Open-text responses can give you vital insight into the reasons behind the quantitative
answers, but they are tricky to analyse; too many of them can be difficult to process and
off-putting for the respondent.
Response rates
The basic rule for response rates is: the more, the better. The higher the response rate
(respondents as a percentage of the total population), the more certain you can be that the
results you have are representative of the whole population. This is not linear – for a given
response rate, the margin of error decreases as the population size grows. For example,
20% of a population of 60 students has a much higher margin of error than 20% of a
population of 10,000 (source: Survey Monkey).
As a general rule, a response rate of 15% to an online survey is considered low, 25%
average, 35% good and 45% excellent.
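To make the margin-of-error point concrete, here is a minimal sketch in Python, assuming
simple random sampling, a 95% confidence level and the standard finite population
correction (the function name is illustrative):

```python
import math

def margin_of_error(population: int, respondents: int,
                    p: float = 0.5, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion, with the
    finite population correction applied."""
    standard_error = math.sqrt(p * (1 - p) / respondents)
    fpc = math.sqrt((population - respondents) / (population - 1))
    return z * standard_error * fpc

# The same 20% response rate from two very different populations:
print(f"{margin_of_error(60, 12):.1%}")         # about 25.5%
print(f"{margin_of_error(10_000, 2_000):.1%}")  # about 2.0%
```

This is why a small cohort needs a much higher response rate than a large one to reach the
same level of certainty.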
How to increase survey response rates
More responses mean more reliable data. Research by HEFCE shows that students are
most likely to respond to a survey if asked to by someone they know (a lecturer, peer,
administrator, etc.). If your group is small, you could ask them to complete it in a lecture or
seminar. If not, email is your best tool, but it should be used carefully. Around 50% of our
survey responses arrive in the 24 hours after a reminder email is sent, but too many emails
annoy respondents. For a survey open for five weeks, we sent out an initial invitation and
two reminders. If you are controlling access to your survey, avoid emailing respondents who
have already taken it.



Personalising the communications you send out can be very effective – for instance
through mail merge or a mailing tool such as Mail Chimp or DotMailer. You can tailor the
sender, recipient name and details within the email to make it more personal to the
respondent. For example, “Dear Laura” is better than “Dear student”, and an email from
someone the recipient knows is more likely to be opened than an email from a generic
mailing list.
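As a minimal sketch of this kind of personalisation in Python (the file name, column
headings, template text and sender name are illustrative, not a UCL standard):

```python
import csv
from string import Template

# $name, $link and $sender are filled in per respondent.
template = Template(
    "Dear $name,\n\n"
    "We would value your feedback on our service. The survey takes "
    "about five minutes: $link\n\n"
    "Best wishes,\n$sender"
)

# respondents.csv is assumed to have columns: name, email, link.
with open("respondents.csv", newline="") as f:
    for row in csv.DictReader(f):
        body = template.substitute(name=row["name"], link=row["link"],
                                   sender="Dr Smith")  # illustrative sender
        # Hand `body` and row["email"] to your mailing tool of choice.
        print(f"To: {row['email']}\n{body}\n")
```

A dedicated mailing tool will do the same substitution for you; the point is that each message
addresses the respondent by name and comes from a sender they recognise.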
You could also put up posters – although research and our experience suggest this has
limited effect – or produce flyers for student common areas. More effective (and less
costly) promotion can be done through the student e-newsletter myUCL, or by asking
colleagues to promote your survey to the students they come into contact with.
Incentives such as a prize draw are a popular and effective way to increase response
rates. Our research shows that the perception of a higher chance of winning is more
motivating than a single large prize – so offer fifty £10 vouchers rather than one £500
voucher. Guaranteed incentives, such as free printing credit or a voucher for a free coffee
for every respondent, have met with mixed success. Bear in mind that a high-value prize
draw could tempt respondents to take your survey twice if you have an open link.
Survey access control
You have several options open to you for controlling access to your survey.
Open access
The simplest is an open link that anyone can use to take your survey. In this case you may
need to add demographic questions, increasing the survey's overall length. Drawbacks
include poorer data quality: respondents could take the survey twice, and anyone who finds
the link can take it.
Open access with verifier
You could require respondents to supply a verifier such as their UCL email address or
student number, which you can then use to ensure there are no duplicates, or to match
responses back to a master list and fill in demographic information (such as department,
domicile or age). This reduces problems with data quality, but it can be time-consuming,
and care should be taken to anonymise the data once it has been cleaned.
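A minimal sketch of the matching and cleaning step, assuming a pandas workflow; the file
names and column names (with 'upi' as the verifier) are illustrative:

```python
import pandas as pd

responses = pd.read_csv("responses.csv")  # includes a 'upi' verifier column
master = pd.read_csv("master_list.csv")   # upi, department, domicile, age

# Keep only the first response per verifier, then fill in demographics.
deduped = responses.drop_duplicates(subset="upi", keep="first")
merged = deduped.merge(master[["upi", "department", "domicile", "age"]],
                       on="upi", how="left")

# Anonymise once cleaned: drop the identifying column before analysis.
merged.drop(columns=["upi"]).to_csv("cleaned_responses.csv", index=False)
```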
Username and password
Many survey tools let you “preload” information about your respondents, allowing you to
export a list of unique usernames and passwords for each one. You can then use a mail
merge or mailing tool (such as Mail Chimp or DotMailer) to send these out. This ensures
much higher data quality, and it can shorten your survey if demographic information is
already loaded, but it can be tricky to administer and requires having respondents’
information available in advance.
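If your tool does not generate credentials for you, a minimal sketch of preparing a preload
file in Python (the respondent records are illustrative; secrets is the standard library module
for unguessable tokens):

```python
import csv
import secrets

# Illustrative records; in practice these come from your master list.
students = [
    {"email": "student.one@example.ac.uk", "department": "History"},
    {"email": "student.two@example.ac.uk", "department": "Physics"},
]

# One random password per respondent, written alongside their details so the
# same file can drive both the survey preload and the mail merge.
with open("preload.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["email", "department", "password"])
    writer.writeheader()
    for s in students:
        writer.writerow({**s, "password": secrets.token_urlsafe(8)})
```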
Unique link
As above, but the survey software generates a unique URL for each user. This can be more
effective than a username and password for each user as it reduces the effort they have to
put into taking your survey. Care should be taken not to inadvertently change the URL while
mailing out the links!
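Unique links can be generated the same way if your tool accepts preloaded tokens; a brief
sketch (the base URL is a placeholder):

```python
import secrets

def unique_link(base_url: str) -> str:
    # 16 random bytes is more than enough to make links unguessable.
    return f"{base_url}?token={secrets.token_urlsafe(16)}"

print(unique_link("https://surveys.example.ac.uk/my-survey"))
```

Because the token is part of the URL, double-check that your mailing tool does not rewrite
or truncate links before sending.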
Contact us
Send an email to [email protected] or contact Rebecca Allen, Csenge Lantos or
Sally Mackenzie for more information.