A Working Example of How to
Use Artificial Intelligence To
Automate and Transform
Surveys Into Customer Feedback
Conversations
ASC Conference
May 2017
Simon Neve
Wizu
Who we are:
• Fusion Software Ltd
• Sold IP of Mojo Surveys product to Microsoft
• Why are we here?
What are we going to talk about?
• Demonstration
• Lessons learnt
• Implications for future
Use case
• Real-time text analytics for open ended questions
Q. “How was your experience?”
A. “It woz rubbbish”
• Ask intelligent follow-up questions
Artificial Intelligence
• Artificial Intelligence web services
• Spell checker
• NLP (Natural Language Processing) and Text Analytics to extract metadata
• Sentiment analysis
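A minimal sketch of the sentiment step, assuming a lexicon-based scorer: the talk used commercial AI web services, so this local stand-in only shows the shape of the call, and the word lists are invented for illustration.

```python
# Illustrative stand-in for a cloud sentiment-analysis service.
# The lexicons below are placeholders, not a real service vocabulary.
POSITIVE = {"good", "great", "helpful", "excellent", "friendly"}
NEGATIVE = {"rubbish", "broken", "bad", "expensive", "dirty"}

def sentiment(text: str) -> str:
    """Classify a free-text answer as positive, negative or neutral."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"
```

A real deployment would call an NLP web service here; the point is that the survey engine only needs a label back to act on.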
Artificial Intelligence – over-hyped
• What is it?
• Enhancing with AI
Automation
• Validate response for typos
• Create real-time intelligent prompts (or not) based on understanding
of answer given
• Ask different questions based on sentiment
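The last bullet can be sketched as a simple routing table, assuming the sentiment label from the previous answer drives the next prompt. The question texts and table here are invented; in practice the configured subject tree would supply them.

```python
# Hypothetical follow-up routing keyed on the sentiment of the
# previous answer. Prompts are placeholder wording for illustration.
FOLLOW_UPS = {
    "positive": "Great to hear! What did we do especially well?",
    "negative": "Sorry about that. What could we improve?",
    "neutral":  "Could you tell us a little more about your stay?",
}

def next_question(sentiment_label: str) -> str:
    """Pick the next prompt, falling back to a neutral probe."""
    return FOLLOW_UPS.get(sentiment_label, FOLLOW_UPS["neutral"])
```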
Outcomes
Improve:
• respondent experience
• completion rates
• data quality
Use case - context
• VoC survey:
  • non-anonymous
  • personalized conversations
  • satisfaction rating
  • positive and negative comments
  • engagement and respondent experience important
• Chat UI
• Subject tree configured
Demonstration (1/4)
• Spellcheck
“teh rooom waz rubb ish”
• Prompt for others
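A toy sketch of the spellcheck step, assuming fuzzy matching against a known-word list via Python's standard `difflib` (the demo used a spell-checking web service; the word list here is invented for the example).

```python
# Illustrative local spellchecker: snap each token to the closest
# known word. KNOWN is a placeholder vocabulary, not a real dictionary.
import difflib

KNOWN = ["the", "room", "was", "rubbish", "shower", "service", "broken"]

def correct(text: str) -> str:
    """Replace each word with its closest known word, if close enough."""
    fixed = []
    for word in text.lower().split():
        match = difflib.get_close_matches(word, KNOWN, n=1, cutoff=0.6)
        fixed.append(match[0] if match else word)
    return " ".join(fixed)
```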
Demonstration (2/4)
• Stop words
• Derogatory content
• MS Tay!
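The Tay lesson above amounts to one guard: never echo respondent text back into the conversation if it contains blocked terms. A minimal sketch, assuming a simple blocklist (the terms are placeholders):

```python
# Hypothetical safety check before repeating respondent content back.
# BLOCKLIST is a placeholder; a real system would use a maintained
# derogatory-content service or list.
BLOCKLIST = {"idiot", "stupid"}

def safe_to_echo(text: str) -> bool:
    """True only if no word in the answer is on the blocklist."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return words.isdisjoint(BLOCKLIST)
```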
Demonstration (3/4)
• Subject Matching
“The Shower was broken”
“The room service was good”
“The assistance was helpful”
“The room was expensive”
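The four comments above can be routed with a keyword-to-subject matcher. This is an illustrative stand-in for the NLP-driven matching in the demo, and the subject tree below is invented:

```python
# Hypothetical subject tree mapping keywords to configured subjects.
SUBJECTS = {
    "facilities": {"shower", "room"},
    "service":    {"service", "assistance", "staff"},
    "price":      {"expensive", "price", "cost"},
}

def match_subjects(comment: str) -> list:
    """Return the subjects whose keywords appear in the comment."""
    words = {w.strip(".,!?").lower() for w in comment.split()}
    return sorted(name for name, keys in SUBJECTS.items() if words & keys)
```

Note that one comment can hit several subjects ("The room was expensive" touches both facilities and price), which is exactly what drives the follow-up prompts.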
Demonstration (4/4)
• Feedback loops
• Sarcasm detection
Limitations
• Not relevant to all surveys
• Increased costs:
• setup
• context
• Ongoing training time
• Different respondents asked different questions
Lessons learnt – Architecture
• Asynchronous
• Collection of services
• Loss of control
• Languages
• Resilience
• Versioning / Data consistency
Lessons learnt – Respondent experience
• Improvement mechanism
• Emoji
• Careful repeating respondent content back – Tay!
Lessons learnt – Context, Context, Context
• Code-frame first
• More contextual data means better interpretation
• In-survey context:
• oSAT (overall satisfaction) question
• Positive comments / negative comments
• Customer Journey
Implications for Survey Industry
• Pricing
• Legal
• Jobs
• Security
Future AI
• Text
• Intent based
• Video
• Measure emotion
• Images
• A picture paints a thousand words
+ Machine learning
+ Deep learning
Contact details
Simon Neve
• [email protected]
• @Simon_Neve
Wizu
• [email protected]
• wizu.com
• @iamwizu