Saudi Medical Journal

Relationship between non-verbal gesture and speech in Saudi who stutter
Abdulaziz Almudhi
Department of Rehabilitation, College of Applied Medical Sciences, King Saud University,
Riyadh-11433, PO Box-10219, Saudi Arabia.
Corresponding author: Dr. Abdulaziz Almudhi, Assistant Professor.
Email: [email protected]
Abstract
Objective: Non-verbal gesture is a tool used by people who stutter to decrease the severity of stuttering; it also helps listeners understand the meaning of words by conveying ideas and linking words and grammatical devices with meanings. The aim of the present study was to assess non-verbal gesture in developmental stuttering in Saudi subjects.
Participants and methods: In this cross-sectional study, 20 subjects were selected; 10 used the speech-hand synchronization (SHS) approach and 10 were controls.
Results: The results of this study confirm improvement of non-verbal gesture in the SHS subjects only in the first 15 sessions; in sessions 16-20 there was no improvement.
Conclusion: The current results confirm that the non-verbal gesture tool can be implemented in subjects who stutter in the Saudi population.
Keywords: Non-verbal gestures, stuttering, Saudi, therapy.
Introduction
Non-verbal gesture aids listeners in understanding the meaning of words, helping to convey ideas and meaning and to link words and grammatical devices1. Human communication skills develop from the embryonic stages, with recent literature emphasizing the importance of non-verbal communication skills (e.g., gestures) relative to verbal communication skills (e.g., language)2. Guitar et al3 identified abnormal lip muscle activity at the onset of bilabial gestures in people who stutter: non-stuttering speakers were consistently found to activate the depressor anguli oris muscle before the depressor labii inferioris for bilabial consonantal gestures, whereas those who stuttered frequently reversed the sequence. Non-verbal gesture may serve multiple functions; facial expression, for example, plays a major role in affective experience by modulating vascular blood flow4. However, the understanding of information depends upon whether the non-verbal gesture and speech convey the same information; although this process is faster, it may generate a few errors5,6. While poor oral motor control may constrain early speech production, cognitive and linguistic skills may act as catalysts affecting the rate of speech development7. The relation between non-verbal gesture and speech therapy has been documented8. The presence of non-verbal gesture can help eliminate speech ambiguity9; moreover, non-verbal gesture not only helps eliminate ambiguity but also carries meaning of its own10. According to Capone et al11, non-verbal gestures can do more than that, supporting the performance of more complex activities such as language comprehension. No such studies have been implemented in the Saudi population, and the aim of the present study was therefore to examine non-verbal gesture in developmental stuttering in Saudi subjects.
Materials and methods
Ethical approval. Ethical approval was granted by the College of Applied Medical Sciences, King Saud University (CAMS 018-37/38).
Subjects. This pilot study recruited clients at the College of Applied Medical Sciences, King Saud University, Riyadh, Saudi Arabia, between January 2016 and May 2016 on the hospital premises. In this study, 10 speech-hand synchronization (SHS) subjects and 10 control subjects were recruited. SHS subjects were selected according to the inclusion criteria described in a prior publication8. Participants in the SHS group (n=10) carried out treatment sessions of 50 minutes each, as described in Table 1. The participants were treated in a clinic environment on a one-to-one basis with a clinician (2 sessions per week in the first 6 weeks and one session per week thereafter). Each participant underwent 50 minutes per session for 20 sessions, including increasing fluent speech by adopting and applying gestures (for example, HM synchronized with PS). Control subjects (n=10) were recruited from the general population without stuttering. All subjects included in this pilot study were of Saudi nationality.
Results
In this study, 10 SHS clients and 10 randomly matched controls were included. Each client completed 20 sessions performed by the speech therapist using the speech-hand synchronization approach. The 20 sessions were divided into the following pattern: the first 2 sessions were designed as sound sessions of 5 seconds each; the third and fourth sessions used nonsense words for 5 seconds; the fifth and sixth sessions used 1-syllable words for 5 seconds; and in the 7th and 8th sessions, 8 seconds were used for 2-syllable words.
Table 1: List of sessions performed in SHS clients in this study. Hand use and synchronization were applied in sessions 1-15 (✓) and withdrawn in sessions 16-20 (×).

| Session | Sound/Syllable | Time | Using hand | Synchro |
|---|---|---|---|---|
| 1 | 1 sound | 5 sec | ✓ | ✓ |
| 2 | 1 sound | 5 sec | ✓ | ✓ |
| 3 | Nonsense word | 5 sec | ✓ | ✓ |
| 4 | Nonsense word | 5 sec | ✓ | ✓ |
| 5 | 1-syllable word | 5 sec | ✓ | ✓ |
| 6 | 1-syllable word | 5 sec | ✓ | ✓ |
| 7 | 2-syllable word | 8 sec | ✓ | ✓ |
| 8 | 2-syllable word | 8 sec | ✓ | ✓ |
| 9 | 3-syllable word | 9 sec | ✓ | ✓ |
| 10 | 3-syllable word | 9 sec | ✓ | ✓ |
| 11 | 3-syllable word | 9 sec | ✓ | ✓ |
| 12 | 2 words (max 6 syllables) | 1 syllable = 2 sec | ✓ | ✓ |
| 13 | 2 words (max 6 syllables) | 1 syllable = 2 sec | ✓ | ✓ |
| 14 | 3 words (max 9 syllables) | 1 syllable = 2 sec | ✓ | ✓ |
| 15 | 3 words (max 9 syllables) | 1 syllable = 2 sec | ✓ | ✓ |
| 16 | 3 words (max 9 syllables) | 1 syllable = 2 sec | × | × |
| 17 | Sentence | 1 syllable = 2 sec | × | × |
| 18 | Sentence | 1 syllable = 2 sec | × | × |
| 19 | Sentence | 1 syllable = 1 sec | × | × |
| 20 | Sentence | 1 syllable = 1 sec | × | × |
In sessions 9-11, 9 seconds were used for 3-syllable words. In the 12th and 13th sessions, 2 words with a maximum of 6 syllables were used, with each syllable given 2 seconds; the same pattern was used in sessions 14-16 with 3 words and a maximum of 9 syllables. In sessions 17 and 18, sentences were used at 1 syllable per 2 seconds, and the same pattern was followed in sessions 19-20 at 1 second per syllable (Table 1). The results of our study confirm success in all 10 clients only through the first 15 sessions, in which hand use and synchronization were applied. However, in sessions 16-20, when hand use and synchronization were withdrawn, no client could form sentences.
Discussion
The aim of the current study was to examine non-verbal gesture in developmental stuttering in Saudi subjects. The results indicate a positive response from the Saudi subjects only in the initial 15 sessions, in which both hand use and synchronization were implemented. Sentence formation failed in the last 5 sessions (16-20) because the speech therapist instructed clients to stop using their hands. All control subjects performed exceptionally well across all 20 sessions, possibly because they did not stutter.
An earlier review by Aldunate et al12 confirmed that various research has shown that emoticons contribute to a greater social presence as a result of the enrichment of text-based communication channels; moreover, emoticons also constitute a valuable resource for language comprehension by providing expressivity to text messages. Several empirical studies on non-verbal gestures and speech production have emphasized non-verbal gesture and shown the importance of the integration between speech production and hand movement. Moreover, non-verbal gesture communication studies have developed rapidly over the past decades, and many studies have been conducted to better understand how non-verbal gesture is coordinated with spoken language13,14.
Speech may be conceptualized as primarily gestural, since there is continuity between manual and verbal language. This argument is also supported by the motor theory of speech15 and, later, by articulatory phonology (AP)16 and action theory17. AP asserts that speech can be explained as a system that produces not only sounds but also articulated gestures. This can be conceptualized as coordinated action involving six articulated organs: the lips, velum, larynx, and the blade, body and root of the tongue. This method is based primarily on the basic units of speech, such as phonemes, which exist as continuous units within the acoustic signal18. They are also not discretely discernible in mechanical recordings of sound, as in a sound spectrograph15. Non-verbal gestures can contribute to changing knowledge through their cognitive effects1. Just as externalized thought might save cognitive effort for other uses19, gesturing might be seen as 'externalizing a speaker's thoughts onto the body'. Further, Ping et al20 found that gesturing during speaking lightens a speaker's cognitive load21,22. As stated earlier, this suggests that non-verbal gestures can directly match the speaker's message, which lightens his/her cognitive load and affects the receiver's cognitive state.
Rizzolatti et al23 claim that language evolved from a brain in which language processing and hand movement share neurological connections, whilst Gentilucci et al24 argue that speech evolved as a gradual transition from manual gesture. Graziano et al25 found that stimulation of the primary motor cortex of monkeys resulted in mouth opening and finger clenching, as well as movement within the mouth. Gallese et al26 noted that Broca's area in the brain is activated by non-linguistic hand movements. This suggests that speech production is also activated by areas responsible for processing spoken language. Understanding utterances requires the simultaneous integration of both modalities in the brain, for which there is neurological evidence27.
Furthermore, functional brain imaging studies have indicated that symbolic non-verbal gestures and spoken words are both processed by a common network of inferior frontal and posterior temporal regions of the left hemisphere28, and that sign language activates Broca's area in the left hemisphere29,30. These results suggest that these areas, rather than being restricted to speech processing, have a modality-independent role in linking meaning with symbols.
The strengths of the current study were the recruitment of Saudi subjects, the conduct of the research in the RHS department at CAMS, KSU, and the implementation of short sessions. The main limitation of this study is its small sample size.
Conclusion
These studies have shown the importance of the integration between speech production and hand movement. In addition, non-verbal gesture communication studies have increased over the past decades, and many have been conducted to better understand how non-verbal gesture is coordinated with spoken language. In the communication field, non-verbal gestures and speech appear to be strongly connected, serving not only to provide the listener with additional information but also to help the speaker convey his/her message clearly. Non-verbal gestures can play the role of facilitator in language comprehension, expression and acquisition. Therefore, we have considered the relationship between non-verbal gesture and speech, given that the two are correlated developmentally, neurologically and behaviorally, and the evidence of their relationship is significant.
Conflict of Interest: There is no conflict of interest regarding this study.
Acknowledgement: The author extends his sincere appreciation to College of Applied
Medical Sciences Research Centre and the Deanship of Scientific Research at King Saud
University.
References
1. Goldin‐Meadow S. How gesture promotes learning throughout childhood. Child Development Perspectives. 2009; 3: 106-11.
2. Kawai E, Takagai S, Takei N, et al. Maternal postpartum depressive symptoms predict delay in non-verbal communication in 14-month-old infants. Infant Behavior and Development. 2017; 46: 33-45.
3. Guitar B, Guitar C, Neilson P, O'Dwyer N, Andrews G. Onset sequencing of selected lip muscles in stutterers and nonstutterers. Journal of Speech and Hearing Research. 1988; 31: 28.
4. Tomkins SS, McCarter R. What and where are the primary affects? Some evidence for a theory. Perceptual and Motor Skills. 1964; 18: 119-58.
5. Kelly SD, Özyürek A, Maris E. Two sides of the same coin: speech and gesture mutually interact to enhance comprehension. Psychological Science. 2009.
6. Tellier M. The development of gesture. Routledge, 2009.
7. Nip IS, Green JR, Marx DB. Early speech motor development: Cognitive and linguistic considerations. Journal of Communication Disorders. 2009; 42: 286-98.
8. Almudhi A. Innovation of Speech Hand Synchronization as a treatment in adults who stutter. Journal of Speech Pathology & Therapy. 2016; 1: 1-4.
9. Kok K, Bergmann K, Cienki A, Kopp S. Mapping out the multifunctionality of speakers' gestures. Gesture. 2016; 15: 37-59.
10. Eggenberger N, Preisig BC, Schumacher R, et al. Comprehension of co-speech gestures in aphasic patients: An eye movement study. PLoS One. 2016; 11: e0146583.
11. Capone NC, McGregor KK. Gesture development: a review for clinical and research practices. Journal of Speech, Language, and Hearing Research. 2004; 47: 173-86.
12. Aldunate N, González-Ibáñez R. An integrated review of emoticons in computer-mediated communication. Frontiers in Psychology. 2017; 7: 2061.
13. Language and Gesture. Cambridge: Cambridge University Press, 2000.
14. Wray C, Saunders N, McGuire R, Cousins G, Norbury C. Gesture production in language impairment: It's quality, not quantity, that matters. Journal of Speech, Language, and Hearing Research. 2016.
15. Liberman AM, Cooper FS, Shankweiler DP, Studdert-Kennedy M. Perception of the speech code. Psychological Review. 1967; 74: 431-61.
16. Browman CP, Goldstein L. Dynamics and articulatory phonology. 1995.
17. Ward D. Intrinsic and extrinsic timing in stutterers' speech: data and implications. Language and Speech. 1997; 40 (Pt 3): 289-310.
18. Joos M. Acoustic phonetics. Language. 1948; 24: 5-136.
19. Clark DM. Anxiety disorders: Why they persist and how to treat them. Behaviour Research and Therapy. 1999; 37: S5-S27.
20. Ping R, Goldin‐Meadow S. Gesturing saves cognitive resources when talking about nonpresent objects. Cognitive Science. 2010; 34: 602-19.
21. Goldin-Meadow S, Nusbaum H, Kelly SD, Wagner S. Explaining math: Gesturing lightens the load. Psychological Science. 2001; 12: 516-22.
22. Wagner SM, Nusbaum H, Goldin-Meadow S. Probing the mental representation of gesture: Is handwaving spatial? Journal of Memory and Language. 2004; 50: 395-407.
23. Rizzolatti G, Arbib MA. Language within our grasp. Trends in Neurosciences. 1998; 21: 188-94.
24. Gentilucci M, Corballis MC. From manual gesture to speech: a gradual transition. Neuroscience & Biobehavioral Reviews. 2006; 30: 949-60.
25. Graziano MS, Taylor CS, Moore T. Complex movements evoked by microstimulation of precentral cortex. Neuron. 2002; 34: 841-51.
26. Gallese V, Fadiga L, Fogassi L, Rizzolatti G. Action recognition in the premotor cortex. Brain. 2009; 132: 1685-9.
27. Hickok G. The functional neuroanatomy of language. Physics of Life Reviews. 2009; 6: 121-43.
28. Xu J, Gannon PJ, Emmorey K, Smith JF, Braun AR. Symbolic gestures and spoken language are processed by a common neural system. Proceedings of the National Academy of Sciences of the United States of America. 2009; 106: 20664-9.
29. Corina DP, San Jose-Robertson L, Guillemin A, High J, Braun AR. Language lateralization in a bimanual language. Journal of Cognitive Neuroscience. 2003; 15: 718-30.
30. Emmorey K, Mehta S, Grabowski TJ. The neural correlates of sign versus word production. NeuroImage. 2007; 36: 202-8.