
Moderator Assistant: helping those who help
via online mental health support groups
Ming Liu, Rafael A. Calvo
School of Electrical and Information Engineering
University of Sydney, Australia and Young and Well CRC
[email protected]
[email protected]
Tracey Davenport, Ian Hickie
Brain and Mind Research Institute
University of Sydney
[email protected]
[email protected]
ABSTRACT
Helping participants of online communities thrive, supporting their pro-social behaviours and meeting duty-of-care obligations are challenging tasks. This is particularly difficult in the online peer support groups that are becoming increasingly popular on social networks like Facebook and through organizations like ReachOut.com in Australia. In this paper we present a novel system called Moderator Assistant, which uses natural language processing techniques to provide automated support for multiple online support groups. The system generates automated interventions based on key terms and concepts extracted from the text posted by participants. A human moderator can select and edit these interventions before sending them to a participant. The system also implements behaviour analysis features to measure the impact of the interventions.
Author Keywords
Mental health, online intervention, natural language
processing.
ACM Classification Keywords
H5.m. Information interfaces and presentation (e.g.,
HCI): Miscellaneous.
INTRODUCTION
Large online health support groups provide an increasingly important type of support to people with mental health problems (Christensen, Griffiths & Farrer, 2009; Webb, Burns & Collin, 2008). Thousands of people go to public social networking websites such as Facebook and LiveJournal (http://livejournal.com/) seeking help, but generally find very few trained people providing professional feedback. Peer-to-peer communities and self-support groups are amongst the most promising forms of e-health (Eysenbach et al., 2004). Peer-to-peer support (Davidson et al., 2006) is based on the assumption that people who have overcome difficulty can provide valuable support, guidance and hope to other people facing similar problems. Helping also improves the helper's self-esteem and reduces self-stigma (Corrigan, 2006). Peer-to-peer support can be considered a form of mental health intervention used independently or bundled with other forms of intervention, but it is often not moderated.

Other online communities provide more structured support via organizations such as the Inspire Foundation Australia (http://inspire.org.au/). These organizations provide services through websites such as ReachOut.com, where help occurs amongst peers yet the community is supported by professionals. On these websites and in these online communities, young people can seek and receive help from trained staff and use professionally developed resources.

In both scenarios, particularly the latter, moderators must spend a significant amount of time providing written feedback. Maintaining the quality of feedback and complying with duty of care is challenging even within small communities, and when a community grows this support may become unsustainable.

Furthermore, staff rotation requires training and makes it hard to follow quality processes and protocols. Moderators provide feedback but generally do not know what its impact is. The feedback provided can differ not only in its content but also in its writing style and tone of voice.
Some studies (Gilat, Tobin & Shahar, 2011) have investigated moderators' response strategies and examined the relationship between messages and types of feedback. Gilat et al. (2011) divided the response strategies for suicidal messages into three categories: emotion-focused, cognitive-focused and behaviour-focused. Emotion-focused responses try to create a personal emotional bond with the distressed individual; for example, "That's an understandable thing, Lucy!" shows understanding. In addition, this type of response often invites the individual to join the group, for example, "I invite you to join us. We are here to support, encourage and help." Cognitive-focused responses aim to alter and broaden the narrow perspective of individuals, for example, "Look at what you write to others; you know how to think positively." The last category is behaviour-focused responses, which contain
behavioural components presenting suggestions or recommendations to individuals, for example, "Hi James. Do you think that you could call MensLine Australia for a chat? They have professional counsellors, experienced in family and relationship issues, and can refer you to local services and support programs." These studies provide foundational knowledge for designing response templates.
Yet it is not easy to quantify which type of response is more conducive to behaviour change and improvement. One way in which feedback can be differentiated is by whether it is autonomy-supportive or directive (Ryan & Deci, 2000). For example, when a user shows signs of a serious depressive episode and indicates they are considering self-harm, a moderator may tell the user "You must visit this website for information and read the case studies", a directive approach, or "Why don't you look into this website for information that might help you. The case studies might help you learn how others have dealt with this type of problem", a much more autonomy-supportive message. Evaluating the long-term impact of such interventions is a new area of research currently labelled 'positive computing' (Calvo & Peters, 2014).
Recent human-computer interaction (HCI) research on mental health has explored online interventions such as internet-based cognitive behaviour therapy systems (Christensen, Griffiths & Farrer, 2009), relational agents (Bickmore & Gruber, 2010), virtual reality (Coyle et al., 2007) and game-based internet interventions (Coyle et al., 2011). Researchers (Doherty, Coyle & Sharry, 2012) have focused on defining guidelines and strategies for such systems in order to improve usability and user engagement. The system presented in this paper implements some of these guidelines.

The aim of this project is to develop a system framework, called Moderator Assistant, that helps moderators easily monitor one or more online support communities, quickly produce interventions automatically generated by the system, and analyse individual behaviours in those online groups.
DESIGN PRINCIPLES
Our system design principles were based on guidelines proposed by HCI researchers Doherty et al. (2010) and Calvo & Peters (2013, 2014). The specific guidelines applied to our system included:

• "Do not place burdensome time demands on moderators". The system should help moderators save time when responding to individual posts and thus allow them to deal with more posts.

• "Make the system adaptable and sustainable". The system's interventions should apply to a broad range of mental disorders of varying degrees of severity, such as depression, anxiety and suicide.

• "Build on existing skills, experience, working methods and values of moderators". Producing an intervention should remain easy and similar to posting in existing online group discussion forums.

• "Work on both positive and negative cognitive, affective and behavioural expressions".

• "Consider the responsibilities placed on moderators". Beyond issuing interventions, moderators should have no extra work.

• "Make the system tangible". The system should keep a record of individual activities within the online support group, which can then be used to assess the appropriateness of interventions.
SYSTEM DESCRIPTION
Moderator Assistant is a web application that uses natural language processing techniques to extract key information that can be used to automatically generate online interventions. Figure 1 shows the basic architecture. Text from online forums and social networks is downloaded and processed in real time by our behavioural analytics system called Tracer (Liu, Calvo & Pardo, 2013). A component (EPM) within Tracer extracts key terms and expressions managed by a knowledge base being built in collaboration with mental health professionals. Figure 3a shows the templating system used to generate the text for the interventions, and Figure 3b shows a visualization component that aims to measure the impact of an intervention (e.g. did the person access the link provided).

Three main use cases have been considered so far:

1. The system downloads posts and creates lists in real time. These can be obtained from multiple online support groups, such as Facebook, Twitter and ReachOut.com, using their APIs. Moderators can act on these lists based on their preferences, such as potential risk level or time of post, and can prioritise responding to posts with a high risk level. A natural language processing component extracts key elements (e.g. username or key phrases) from each post and classifies each one into a predefined category, such as depression, self-harm, distorted thoughts or positive behaviour. Each category is assigned a risk level by the moderator. Figure 2a illustrates the page that lists the predefined categories with their risk levels. Each category contains keywords and syntactic patterns that constitute the system's knowledge base; a minimal sketch of this step is given below.
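To make this step concrete, the following is a minimal sketch of how posts fetched from a group feed could be matched against a keyword- and pattern-based knowledge base and listed for moderators, highest risk first. The category definitions, risk levels and feed endpoint are illustrative assumptions, not the actual Tracer/EPM implementation.

    import re
    import requests

    # Illustrative knowledge base: each category maps to a moderator-assigned
    # risk level and a few keywords/syntactic patterns (assumed, not Tracer's).
    KNOWLEDGE_BASE = {
        "self-harm":          {"risk": 3, "patterns": [r"\bhurt myself\b", r"\bself[- ]harm\b"]},
        "depression":         {"risk": 2, "patterns": [r"\bhopeless\b", r"\bcan'?t go on\b"]},
        "distorted thoughts": {"risk": 2, "patterns": [r"\bnobody (likes|cares about) me\b"]},
        "positive behaviour": {"risk": 0, "patterns": [r"\bwent for a (run|walk)\b"]},
    }

    def classify(post_text):
        """Return (category, risk) for the highest-risk category that matches."""
        best = ("uncategorised", -1)
        for category, entry in KNOWLEDGE_BASE.items():
            if any(re.search(p, post_text, re.IGNORECASE) for p in entry["patterns"]):
                if entry["risk"] > best[1]:
                    best = (category, entry["risk"])
        return best

    def fetch_posts(feed_url, token):
        """Fetch recent posts from a JSON feed API (hypothetical endpoint shape)."""
        resp = requests.get(feed_url, params={"access_token": token})
        resp.raise_for_status()
        return resp.json().get("data", [])

    # Example: list posts for the moderator, highest risk first.
    posts = fetch_posts("https://example.org/api/group/feed", token="...")
    ranked = sorted(
        ((classify(p.get("message", "")), p) for p in posts),
        key=lambda pair: pair[0][1],
        reverse=True,
    )
    for (category, risk), post in ranked:
        print(risk, category, post.get("message", "")[:60])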
Figure 2a shows posts the system has identified as needing a response, either because they represent an expression of risky behaviour (e.g. substance abuse or self-harm) or a positive behaviour that moderators want to reinforce (e.g. healthy activities). Figure 2b shows two automatically generated interventions for such a post.
2. Moderators choose or modify one automated intervention, which is then sent to the individual as a comment on the post through the relevant API; a sketch of this step follows.
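As an illustration of this step, the sketch below posts a moderator-approved intervention as a comment through a generic REST endpoint. The endpoint URL and payload shape are assumptions for illustration; each platform (e.g. the Facebook Graph API) has its own comment-creation call.

    import requests

    def send_intervention(api_base, post_id, text, token):
        """Post an approved intervention as a comment on the original post.

        The endpoint shape here is hypothetical; real platforms differ.
        """
        resp = requests.post(
            f"{api_base}/posts/{post_id}/comments",
            json={"message": text},
            headers={"Authorization": f"Bearer {token}"},
        )
        resp.raise_for_status()
        return resp.json()

    # Example: the moderator edited and approved this generated intervention.
    send_intervention(
        "https://example.org/api",
        post_id="12345",
        text="Hi James. Do you think that you could call MensLine Australia for a chat?",
        token="...",
    )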
Figure 1: Architecture of Moderator Assistant
Figure 2: System lists posts obtained from multiple online groups for a moderator to take action (left); the moderator chooses an automatically generated intervention to be sent (right)

Figure 3: Intervention template (left); individual behaviour visualization (right)
Figure 2b shows the intervention page, where a moderator can adapt an automatically generated intervention, rate the quality of each intervention and select the preferred one to be sent. Adaptability has been identified as an important requirement for new technologies in mental health care (Coyle & Doherty, 2009). Figure 3a shows the page where moderators can define or update an intervention template based on different theoretical approaches. Each template contains specific elements extracted from a post, such as the poster's name, the time and the post category; a sketch of such template filling follows.
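The following sketch illustrates how such a template could be instantiated with elements extracted from a post. The template wording and field names are illustrative assumptions; in the system, templates are authored by moderators.

    from string import Template

    # An illustrative behaviour-focused template with placeholders for
    # elements extracted from the post (names and wording are assumed).
    TEMPLATE = Template(
        "Hi $name. Do you think that you could call $service for a chat? "
        "They have professional counsellors experienced in $topic issues."
    )

    def generate_intervention(post_elements):
        """Fill the template with key elements extracted by the NLP component."""
        return TEMPLATE.substitute(post_elements)

    # Example: elements extracted from a post about relationship problems.
    print(generate_intervention({
        "name": "James",
        "service": "MensLine Australia",
        "topic": "family and relationship",
    }))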
3. Moderators visualize group members' behaviour, including posting a message, receiving an intervention and responding to an intervention. Figure 3b shows the visualization that lets a moderator monitor individual behaviours in an online support group. In this visualization each row represents a person and each data point represents an event; different colours and shapes distinguish event types. In this case a blue circle represents a post event, an orange star an intervention-received event, and a brown square a response to an intervention. This helps moderators keep track of individual behaviours and use them for discussion throughout the counselling process; a sketch of such a timeline appears below.
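As an illustration, the sketch below draws a per-person event timeline of this kind with matplotlib. The marker and colour conventions follow the description above; the event data is invented for demonstration.

    import matplotlib.pyplot as plt

    # Invented example events: (person, timestamp in days, event type).
    EVENTS = [
        ("Lucy", 1, "post"), ("Lucy", 2, "intervention"), ("Lucy", 4, "response"),
        ("James", 1, "post"), ("James", 3, "intervention"),
    ]

    # Marker and colour conventions from the description above.
    STYLE = {
        "post":         {"marker": "o", "c": "blue"},
        "intervention": {"marker": "*", "c": "orange"},
        "response":     {"marker": "s", "c": "brown"},  # square marker
    }

    people = sorted({person for person, _, _ in EVENTS})
    rows = {person: i for i, person in enumerate(people)}

    fig, ax = plt.subplots()
    for person, day, kind in EVENTS:
        ax.scatter(day, rows[person], **STYLE[kind])
    ax.set_yticks(range(len(people)))
    ax.set_yticklabels(people)
    ax.set_xlabel("Day")
    ax.set_title("Individual behaviour timeline")
    plt.show()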
CONCLUSION AND FUTURE WORK
In this paper we described Moderator Assistant, a system that helps human moderators easily moderate online support groups. Three key use cases highlight the system's main features: integration of multiple online groups, automated intervention generation and individual behaviour analysis. The natural language processing component is the core of the system: correctly classifying each post into a category is crucial, and the predefined categories should cover the common mental health issues, such as depression, anxiety and suicide.

We are about to begin a pilot project to evaluate the quality of the generated interventions. Our future work will focus on establishing a general taxonomy for addressing mental health related issues, further developing the NLP component, and evaluating the system's performance and usability.
ACKNOWLEDGEMENT
The project is supported by the Young and Well CRC.
REFERENCES
Bickmore, T. and Gruber, A. Relational agents in clinical psychiatry. Harvard Review of Psychiatry, 18, 2 (2010).

Calvo, R. A. and Peters, D. Promoting psychological wellbeing: loftier goals for new technologies. IEEE Technology and Society Magazine, 32, 4 (2013).

Calvo, R. A. and Peters, D. Positive Computing. MIT Press, Cambridge, MA (to appear).

Christensen, H., Griffiths, K. M. and Farrer, L. Adherence in internet interventions for anxiety and depression: systematic review. Journal of Medical Internet Research, 11, 2 (2009).

Corrigan, P. W. Impact of consumer-operated services on empowerment and recovery of people with psychiatric disabilities. Psychiatric Services, 57, 10 (2006), 1493-1496.

Coyle, D. and Doherty, G. Clinical evaluations and collaborative design: developing new technologies for mental healthcare interventions. In Proc. CHI 2009, ACM Press (2009), 2051-2060.

Coyle, D., Doherty, G., Matthews, M. and Sharry, J. Computers in talk-based mental health interventions. Interacting with Computers, 19, 4 (2007), 545-562.

Coyle, D., McGlade, N., Doherty, G. and O'Reilly, G. Exploratory evaluations of a computer game supporting cognitive behavioural therapy for adolescents. In Proc. CHI 2011, ACM Press (2011), 2937-2946.

Davidson, L., Chinman, M., Sells, D. and Rowe, M. Peer support among adults with serious mental illness: a report from the field. Schizophrenia Bulletin, 32, 3 (2006), 443-450.

Doherty, G., Coyle, D. and Matthews, M. Design and evaluation guidelines for mental health technologies. Interacting with Computers, 22, 4 (2010), 243-252.

Doherty, G., Coyle, D. and Sharry, J. Engagement with online mental health interventions: an exploratory clinical study of a treatment for depression. In Proc. CHI 2012, ACM Press (2012).

Eysenbach, G., Powell, J., Englesakis, M., Rizo, C. and Stern, A. Health related virtual communities and electronic support groups: systematic review of the effects of online peer to peer interactions. British Medical Journal, 328, 7449 (2004).

Gilat, I., Tobin, Y. and Shahar, G. Offering support to suicidal individuals in an online support group. Archives of Suicide Research, 15, 3 (2011), 195-206.

Liu, M., Calvo, R. A. and Pardo, A. Tracer: a tool to measure student engagement in writing activities. In Proc. 13th IEEE International Conference on Advanced Learning Technologies, IEEE (2013).

Ryan, R. M. and Deci, E. L. Self-determination theory and the facilitation of intrinsic motivation, social development, and well-being. American Psychologist, 55, 1 (2000), 68-78.

Webb, M., Burns, J. and Collin, P. Providing online support for young people with mental health difficulties: challenges and opportunities explored. Early Intervention in Psychiatry, 2, 2 (2008), 108-113. doi:10.1111/j.1751-7893.2008.00066.x