David J Ostry, PhD
Professor
M.Sc. Industrial Engineering, University of Toronto, Toronto
Ph.D. Psychology, University of Toronto, Toronto
Tel.: +1-514-398-6111
Email: david.ostry@mcgill.ca
Journal Articles
Van Vugt FT, Ostry DJ (2018) The Structure and Acquisition of Sensorimotor Maps. J Cogn Neurosci 30:290-306.
Abstract - PDF
One of the puzzles of learning to talk or play a musical instrument is how we learn which movement produces a
particular sound: an audiomotor map. Existing research has used mappings that are already well learned, such as
controlling a cursor using a computer mouse. By contrast, the acquisition of novel sensorimotor maps was studied
by having participants learn arm movements to auditory targets. These sounds did not come from different directions but,
like speech, were only distinguished by their frequencies. It is shown that learning involves forming not one but two maps:
a point map connecting sensory targets with motor commands and an error map linking sensory errors to motor corrections.
Learning a point map is possible even when targets never repeat. Thus, although participants make errors, there is no
opportunity to correct them because the target is different on every trial, and therefore learning cannot be driven by
error correction. Furthermore, when the opportunity for error correction is provided, it is seen that acquiring error
correction is itself a learning process that changes over time and results in an error map. In principle, the error map
could be derived from the point map, but instead, these two maps are independently acquired and jointly enable sensorimotor
control and learning. A computational model shows that this dual encoding is optimal and simulations based on this
architecture predict that learning the two maps results in performance improvements comparable with those observed empirically.
Sidarta A, Vahdat S, Bernardi NF, Ostry DJ (2016) Somatic and reinforcement-based plasticity in the initial stages of human motor learning. J Neurosci 36:11682-11692.
Abstract - PDF
As one learns to dance or play tennis, the desired somatosensory state is typically unknown. Trial and error is important as motor
behavior is shaped by successful and unsuccessful movements. As an experimental model, we designed a task in which human participants
make reaching movements to a hidden target and receive positive reinforcement when successful. We identified somatic and
reinforcement-based sources of plasticity on the basis of changes in functional connectivity using resting-state fMRI before and after
learning. The neuroimaging data revealed reinforcement-related changes in both motor and somatosensory brain areas in which a
strengthening of connectivity was related to the amount of positive reinforcement during learning. Areas of prefrontal cortex were
similarly altered in relation to reinforcement, with connectivity between sensorimotor areas of putamen and the reward-related ventromedial
prefrontal cortex strengthened in relation to the amount of successful feedback received. In other analyses, we assessed connectivity
related to changes in movement direction between trials, a type of variability that presumably reflects exploratory strategies during
learning. We found that connectivity in a network linking motor and somatosensory cortices increased with trial-to-trial changes in
direction. Connectivity varied as well with the change in movement direction following incorrect movements. Here the changes were
observed in a somatic memory and decision making network involving ventrolateral prefrontal cortex and second somatosensory cortex.
Our results point to the idea that the initial stages of motor learning are not wholly motor but rather involve plasticity in somatic and
prefrontal networks related both to reward and exploration.
Lametti DR, Rochet-Capellan A, Neufeld E, Shiller DM, Ostry DJ (2014) Plasticity in the human speech motor system drives changes in speech perception. J Neurosci 34:10339-10346.
Abstract - Article in PDF format (390 KB)
Recent studies of human speech motor learning suggest that learning is accompanied by changes in auditory perception. But what drives the perceptual change? Is it a consequence of changes in the motor system? Or is it a result of sensory inflow during learning? Here, subjects participated in a speech motor-learning task involving adaptation to altered auditory feedback and they were subsequently tested for perceptual change. In two separate experiments, involving two different auditory perceptual continua, we show that changes in the speech motor system that accompany learning drive changes in auditory speech perception. Specifically, we obtained changes in speech perception when adaptation to altered auditory feedback led to speech production that fell into the phonetic range of the speech perceptual tests. However, a similar change in perception was not observed when the auditory feedback that subjects received during learning fell into the phonetic range of the perceptual tests. This indicates that the central motor outflow associated with vocal sensorimotor adaptation drives changes to the perceptual classification of speech sounds.
Lametti DR, Krol SA, Shiller DM, Ostry DJ (2014) Brief periods of auditory perceptual training can determine the sensory targets of speech motor learning. Psychol Sci. 25:1325-1336.
Abstract - Article in PDF format (693 KB)
The perception of speech is notably malleable in adults, yet alterations in perception seem to have little impact on speech production. However, we hypothesized that speech perceptual training might immediately influence speech motor learning. To test this, we paired a speech perceptual-training task with a speech motor-learning task. Subjects performed a series of perceptual tests designed to measure and then manipulate the perceptual distinction between the words head and had. Subjects then produced head with the sound of the vowel altered in real time so that they heard themselves through headphones producing a word that sounded more like had. In support of our hypothesis, the amount of motor learning in response to the voice alterations depended on the perceptual boundary acquired through perceptual training. The studies show that plasticity in adults' speech perception can have immediate consequences for speech production in the context of speech learning.
Ito T, Johns AR, Ostry DJ (2014) Left lateralized enhancement of orofacial somatosensory processing due to speech sounds. J Speech Lang Hear Res. 56:1875-81.
Abstract - Article in PDF format (199 KB)
PURPOSE:
Somatosensory information associated with speech articulatory movements affects the perception of speech sounds and vice versa, suggesting an intimate linkage between speech production and perception systems. However, it is unclear which cortical processes are involved in the interaction between speech sounds and orofacial somatosensory inputs. The authors examined whether speech sounds modify orofacial somatosensory cortical potentials that were elicited using facial skin perturbations.
METHOD:
Somatosensory event-related potentials in EEG were recorded in 3 background sound conditions (pink noise, speech sounds, and nonspeech sounds) and also in a silent condition. Facial skin deformations that are similar in timing and duration to those experienced in speech production were used for somatosensory stimulation.
RESULTS:
The authors found that speech sounds reliably enhanced the first negative peak of the somatosensory event-related potential when compared with the other 3 sound conditions. The enhancement was evident at electrode locations above the left motor and premotor area of the orofacial system. The result indicates that speech sounds interact with somatosensory cortical processes that are produced by speech-production-like patterns of facial skin stretch.
CONCLUSION:
Neural circuits in the left hemisphere, presumably in left motor and premotor cortex, may play a prominent role in the interaction between auditory inputs and speech-relevant somatosensory processing.
Vahdat S, Darainy M, Ostry DJ (2014) Structure of plasticity in human sensory and motor networks due to perceptual learning. J Neurosci 34:2451-63.
Abstract - Article in PDF format (2.25 MB)
As we begin to acquire a new motor skill, we face the dual challenge of determining and
refining the somatosensory goals of our movements and establishing the best motor commands to
achieve our ends. The two typically proceed in parallel, and accordingly it is unclear how
much of skill acquisition is a reflection of changes in sensory systems and how much reflects
changes in the brain's motor areas. Here we have intentionally separated perceptual and motor
learning in time so that we can assess functional changes to human sensory and motor networks as
a result of perceptual learning. Our subjects underwent fMRI scans of the resting brain before
and after a somatosensory discrimination task. We identified changes in functional connectivity
that were due to the effects of perceptual learning on movement. For this purpose, we used a
neural model of the transmission of sensory signals from perceptual decision making through to
motor action. We used this model in combination with a partial correlation technique to parcel
out those changes in connectivity observed in motor systems that could be attributed to activity
in sensory brain regions. We found that, after removing effects that are linearly correlated with
somatosensory activity, perceptual learning results in changes to frontal motor areas that are
related to the effects of this training on motor behavior and learning. This suggests that
perceptual learning produces changes to frontal motor areas of the brain and may thus contribute
directly to motor learning.
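The partial correlation technique described in this abstract, removing components of motor-area connectivity that are linearly related to sensory-region activity, can be illustrated with a minimal sketch. This is illustrative only, not the authors' analysis pipeline, and the variable names are invented:

```python
import numpy as np

def partial_corr(x, y, z):
    """Correlation between x and y after linearly regressing z out of both.

    x, y, z: 1-D arrays (e.g., signals from two motor regions and a
    sensory region). Returns the correlation of the residuals.
    """
    Z = np.column_stack([np.ones_like(z), z])          # regressors: intercept + z
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]  # residual of x given z
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]  # residual of y given z
    return np.corrcoef(rx, ry)[0, 1]
```

If x and y covary only because both track z, their raw correlation is high while the partial correlation is near zero; in the logic of the study, connectivity changes that survive this adjustment cannot be attributed to the shared somatosensory signal.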
Darainy M, Vahdat S, Ostry DJ (2013) Perceptual learning in sensorimotor adaptation. J Neurophysiol 110: 2152-2162.
Abstract - Article in PDF format (935 KB)
Motor learning often involves situations in which the somatosensory targets of movement are initially poorly defined, as, for example, in learning to speak or learning the feel of a proper tennis serve. Under these conditions,
motor skill acquisition presumably requires perceptual as well
as motor learning. That is, it engages both the progressive
shaping of sensory targets and associated changes in motor
performance. In the present paper, we test the idea that
perceptual learning alters somatosensory function and in
so doing produces changes to motor performance and sensorimotor
adaptation. Subjects in these experiments undergo perceptual
training in which a robotic device passively moves the arm on
one of a set of fan shaped trajectories. Subjects are required
to indicate whether the robot moved the limb to the right or the
left and feedback is provided. Over the course of training both
the perceptual boundary and acuity are altered. The perceptual
learning is observed to improve both the rate and extent of
learning in a subsequent sensorimotor adaptation task and the
benefits persist for at least 24 hours. The improvement in the
present studies is obtained regardless of whether the perceptual
boundary shift serves to systematically increase or decrease error
on subsequent movements. The beneficial effects of perceptual
training are found to be substantially dependent upon reinforced
decision-making in the sensory domain. Passive-movement training
on its own is less able to alter subsequent learning in the motor
system. Overall, this study suggests perceptual learning plays an
integral role in motor learning.
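Perceptual boundary and acuity measures of the kind described above are commonly estimated by fitting a logistic psychometric function to the left/right judgments. A hedged sketch follows; it is illustrative, not the study's actual fitting code:

```python
import numpy as np
from scipy.optimize import curve_fit

def fit_psychometric(x, p_right):
    """Fit a logistic psychometric function to judgment data.

    x: stimulus values (e.g., lateral deviation of the passive trajectory)
    p_right: proportion of "right" judgments at each stimulus value
    Returns (boundary, slope): the boundary is the 50% point; a smaller
    slope parameter corresponds to a steeper function and sharper acuity.
    """
    def logistic(x, mu, s):
        return 1.0 / (1.0 + np.exp(-(x - mu) / s))
    (mu, s), _ = curve_fit(logistic, x, p_right, p0=[np.mean(x), 1.0])
    return mu, s
```

A shift in the fitted boundary from before to after training quantifies the perceptual learning, and a decrease in the slope parameter indicates improved acuity.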
Bernardi NF, Darainy M, Bricolo E, Ostry DJ (2013) Observing motor learning produces somatosensory change. J Neurophysiol 110: 1804-1810.
Abstract - Article in PDF format (264 KB)
Observing the actions of others has been shown to affect motor
learning, but does it have effects on sensory systems as well? It has
been recently shown that motor learning that involves actual physical
practice is also associated with plasticity in the somatosensory system.
Here, we assessed the idea that observational learning likewise
changes somatosensory function. We evaluated changes in somatosensory
function after human subjects watched videos depicting motor
learning. Subjects first observed video recordings of reaching movements
either in a clockwise or counterclockwise force field. They were
then trained in an actual force-field task that involved a counterclockwise
load. Measures of somatosensory function were obtained before
and after visual observation and also following force-field learning.
Consistent with previous reports, video observation promoted motor
learning. We also found that somatosensory function was altered
following observational learning, both in direction and in magnitude,
in a manner similar to that which occurs when motor learning is
achieved through actual physical practice. Observation of the same
sequence of movements in a randomized order did not result in
somatosensory perceptual change. Observational learning and real
physical practice appear to tap into the same capacity for sensory
change in that subjects that showed a greater change following observational
learning showed a reliably smaller change following physical
motor learning. We conclude that effects of observing motor
learning extend beyond the boundaries of traditional motor circuits, to
include somatosensory representations.
Ito S, Darainy M, Sasaki M, Ostry DJ (2013) Computational model of motor learning and perceptual change. Biol Cybern 107:653-667.
Abstract - Article in PDF format (1.28 MB)
Motor learning in the context of arm reaching
movements has been frequently investigated using the paradigm
of force-field learning. It has been recently shown that
changes to somatosensory perception are likewise associated
with motor learning. Changes in perceptual function may be
the reason that when the perturbation is removed following
motor learning, the hand trajectory does not return to a
straight line path even after several dozen trials. To explain
the computational mechanisms that produce these characteristics,
we propose a motor control and learning scheme using
a simplified two-link system in the horizontal plane: we represent
learning as the adjustment of desired joint-angular trajectories
so as to achieve the reference trajectory of the hand.
The convergence of the actual hand movement to the reference
trajectory is proved by using a Lyapunov-like lemma,
and the result is confirmed using computer simulations. The
model assumes that changes in the desired hand trajectory
influence the perception of hand position and this in turn
affects movement control. Our computer simulations support
the idea that perceptual change may come as a result of
adjustments to movement planning with motor learning.
Nasir SM, Darainy M, Ostry DJ (2013) Sensorimotor adaptation changes the neural coding of somatosensory stimuli. J Neurophysiol 109:2077-85.
Abstract - Article in PDF format (692 KB)
Motor learning is reflected in changes to the brain’s functional organization as a result of experience. We show here that these changes are not limited to motor areas of the brain and indeed that motor learning also changes sensory systems. We test for plasticity in sensory systems using somatosensory evoked potentials (SEPs). A robotic device is used to elicit somatosensory inputs by displacing the arm in the direction of applied force during learning. We observe that following learning there are short latency changes to the response in somatosensory areas of the brain that are reliably correlated with the magnitude of motor learning: subjects who learn more show greater changes in SEP magnitude. The effects we observe are tied to motor learning. When the limb is displaced passively, such that subjects experience similar movements but without experiencing learning, no changes in the evoked response are observed. Sensorimotor adaptation thus alters the neural coding of somatosensory stimuli.
Mattar AAG, Darainy M, Ostry DJ (2013) Motor learning and its sensory effects: The time course of perceptual change, and its presence with gradual introduction of load. J Neurophysiol 109:782-91.
Abstract - Article in PDF format (413 KB)
A complex interplay has been demonstrated between motor and sensory systems. We showed recently that motor learning leads to changes in the sensed position of the limb (Ostry DJ, Darainy M, Mattar AA, Wong J, Gribble PL. J Neurosci 30: 5384–5393, 2010). Here, we document further the links between motor learning and changes in somatosensory perception. To study motor learning, we used a force field paradigm in which subjects learn to compensate for forces applied to the hand by a robotic device. We used a task in which subjects judge lateral displacements of the hand to study somatosensory perception. In a first experiment, we divided the motor learning task into incremental phases and tracked sensory perception throughout. We found that changes in perception occurred at a slower rate than changes in motor performance. A second experiment tested whether awareness of the motor learning process is necessary for perceptual change. In this experiment, subjects were exposed to a force field that grew gradually in strength. We found that the shift in sensory perception occurred even when awareness of motor learning was reduced. These experiments argue for a link between motor learning and changes in somatosensory perception, and they are consistent with the idea that motor learning drives sensory change.
Lametti DR, Nasir SM, Ostry DJ (2012) Sensory preference in speech production revealed by simultaneous alteration of auditory and somatosensory feedback. J Neurosci 32:9351-9359.
Abstract - Article in PDF format (1403 KB)
The idea that humans learn and maintain accurate speech by carefully monitoring auditory feedback is widely held. But this view neglects the fact that auditory feedback is highly correlated with somatosensory feedback during speech production. Somatosensory feedback from speech movements could be a primary means by which cortical speech areas monitor the accuracy of produced speech. We tested this idea by placing the somatosensory and auditory systems in competition during speech motor learning. To do this, we combined two speech-learning paradigms to simultaneously alter somatosensory and auditory feedback in real time as subjects spoke. Somatosensory feedback was manipulated by using a robotic device that altered the motion path of the jaw. Auditory feedback was manipulated by changing the frequency of the first formant of the vowel sound and playing back the modified utterance to the subject through headphones. The amount of compensation for each perturbation was used as a measure of sensory reliance. All subjects were observed to correct for at least one of the perturbations, but auditory feedback was not dominant. Indeed, some subjects showed a stable preference for either somatosensory or auditory feedback during speech.
Rochet-Capellan A, Richer L, Ostry DJ (2012) Non-homogeneous transfer reveals specificity in speech motor learning. J Neurophysiol 107:1711-1717.
Abstract - Article in PDF format (461 KB)
Does motor learning generalize to new situations that are not experienced during training, or is motor learning essentially specific to the training situation? In the present experiments, we use speech production as a model to investigate generalization in motor learning. We tested for generalization from training to transfer utterances by varying the acoustical similarity between these two sets of utterances. During the training phase of the experiment, subjects received auditory feedback that was altered in real time as they repeated a single consonant-vowel-consonant utterance. Different groups of subjects were trained with different consonant-vowel-consonant utterances, which differed from a subsequent transfer utterance in terms of the initial consonant or vowel. During the adaptation phase of the experiment, we observed that subjects in all groups progressively changed their speech output to compensate for the perturbation (altered auditory feedback). After learning, we tested for generalization by having all subjects produce the same single transfer utterance while receiving unaltered auditory feedback. We observed limited transfer of learning, which depended on the acoustical similarity between the training and the transfer utterances. The gradients of generalization observed here are comparable to those observed in limb movement. The present findings are consistent with the conclusion that speech learning remains specific to individual instances of learning.
Mattar AAG, Nasir SM, Darainy M, Ostry DJ (2011) Sensory change following motor learning. In Green AM, Chapman CE, Kalaska JF, Lepore F (Eds), Progress in Brain Research, Volume 191 (pp 29-42).
Abstract - Article in PDF format (1.10 MB)
Here we describe two studies linking perceptual change with motor learning. In the first, we
document persistent changes in somatosensory perception that occur following force field learning. Subjects learned to control a robotic device that applied forces to the hand during arm movements. This led to a change in the sensed position of the limb that lasted at least 24 h. Control experiments revealed that the sensory change depended on motor learning. In the second study, we describe changes in the perception of speech sounds that occur following speech motor learning. Subjects adapted control of speech movements to compensate for loads applied to the jaw by a robot. Perception of speech sounds was measured before and after motor learning. Adapted subjects showed a consistent shift in perception.
In contrast, no consistent shift was seen in control subjects and subjects that did not adapt to the load. These studies suggest that motor learning changes both sensory and motor function.
Vahdat S, Darainy M, Milner TE, Ostry DJ (2011) Functionally specific changes in resting-state sensorimotor networks after motor learning. J Neurosci 31:16907-16915.
Abstract - Article in PDF format (603 KB)
Motor learning changes the activity of cortical motor and subcortical areas of the brain, but does learning affect sensory systems as well?
We examined in humans the effects of motor learning using fMRI measures of functional connectivity under resting conditions and found
persistent changes in networks involving both motor and somatosensory areas of the brain. We developed a technique that allows us to
distinguish changes in functional connectivity that can be attributed to motor learning from those that are related to perceptual changes
that occur in conjunction with learning. Using this technique, we identified a new network in motor learning involving second somatosensory
cortex, ventral premotor cortex, and supplementary motor cortex whose activation is specifically related to perceptual changes
that occur in conjunction with motor learning. We also found changes in a network comprising cerebellar cortex, primary motor cortex,
and dorsal premotor cortex that were linked to the motor aspects of learning. In each network, we observed highly reliable linear
relationships between neuroplastic changes and behavioral measures of either motor learning or perceptual function. Motor learning
thus results in functionally specific changes to distinct resting-state networks in the brain.
Rochet-Capellan A, Ostry DJ (2011) Simultaneous acquisition of multiple auditory-motor transformations in speech. J Neurosci 31:2648-2655.
Abstract - Article in PDF format (629 KB)
The brain easily generates the movement that is needed in a given situation. Yet surprisingly, the results of experimental studies suggest that it is difficult to acquire more than one skill at a time. To do so, it has generally been necessary to link the required movement to arbitrary cues. In the present study, we show that speech motor learning provides an informative model for the acquisition of multiple sensorimotor skills. During training, subjects were required to repeat aloud individual words in random order while auditory feedback was altered in real-time in different ways for the different words. We found that subjects can quite readily and simultaneously modify their speech movements to correct for these different auditory transformations. This multiple learning occurs effortlessly without explicit cues and without any apparent awareness of the perturbation. The ability to simultaneously learn several different auditory-motor transformations is consistent with the idea that, in speech motor learning, the brain acquires instance-specific memories. The results support the hypothesis that speech motor learning is fundamentally local.
Ito T, Ostry DJ (2010) Somatosensory contribution to motor learning due to facial skin deformation. J Neurophysiol 104:1230-1230.
Abstract - Article in PDF format (248 KB)
Motor learning is dependent on kinesthetic information that is obtained both from cutaneous afferents and from muscle receptors. In human arm
movement, information from these two kinds of afferents is largely
correlated. The facial skin offers a unique situation in which there
are plentiful cutaneous afferents and essentially no muscle receptors
and, accordingly, experimental manipulations involving the facial skin
may be used to assess the possible role of cutaneous afferents in motor
learning. We focus here on the information for motor learning provided
by the deformation of the facial skin and the motion of the lips in the
context of speech. We used a robotic device to slightly stretch the
facial skin lateral to the side of the mouth in the period immediately
preceding movement. We found that facial skin stretch increased lip
protrusion in a progressive manner over the course of a series of
training trials. The learning was manifest in a changed pattern of lip
movement, when measured after learning in the absence of load. The
newly acquired motor plan generalized partially to another speech task
that involved a lip movement of different amplitude. Control tests
indicated that the primary source of the observed adaptation was
sensory input from cutaneous afferents. The progressive increase in lip
protrusion over the course of training fits with the basic idea that
change in sensory input is attributed to motor performance error.
Sensory input, which in the present study precedes the target movement,
is credited to the target-related motion, even though the skin stretch
is released prior to movement initiation. This supports the idea that
the nervous system generates motor commands on the assumption that
sensory input and kinematic error are in register.
Lametti DR, Ostry DJ (2010) Postural constraint on movement variability. J Neurophysiol 104:1061-1067.
Abstract - Article in PDF format (2625 KB)
Movements are inherently variable. When we move to a particular point in space, a cloud of final limb positions is observed around the target. Previously we noted that patterns of variability at the end of movement to a circular target were not circular, but instead reflected patterns of limb stiffness: in directions where limb stiffness was high, variability in end position was low, and vice versa. Here we examine
the determinants of variability at movement end in more detail. To do
this, we have subjects move the handle of a robotic device from
different starting positions into a circular target. We use position
servo-controlled displacements of the robot’s handle to measure limb
stiffness at the end of movement and we also record patterns of end
position variability. To examine the effect of change in posture on
movement variability, we use a visual motor transformation in which we
change the limb configuration and also the actual movement target,
while holding constant the visual display. We find that, regardless of
movement direction, patterns of variability at the end of movement vary
systematically with limb configuration and are also related to patterns
of limb stiffness, which are likewise configuration dependent. The
result suggests that postural configuration determines the base level
of movement variability, on top of which control mechanisms can act to
further alter variability.
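Stiffness measurement of the kind described above, relating the restoring force evoked by a servo-controlled displacement to that displacement, amounts to a least-squares fit of a 2x2 stiffness matrix. A minimal sketch, assuming a linear force-displacement relation (illustrative only, not the authors' estimation code):

```python
import numpy as np

def estimate_stiffness(displacements, forces):
    """Least-squares estimate of a 2x2 limb stiffness matrix K with f = K d.

    displacements: (n, 2) imposed hand displacements (servo perturbations)
    forces: (n, 2) measured restoring forces
    """
    # Solve displacements @ X ~= forces in the least-squares sense,
    # then transpose so that forces ~= K @ displacements.
    X, *_ = np.linalg.lstsq(displacements, forces, rcond=None)
    return X.T
```

The eigenvectors of the symmetric part of K give the directions of maximum and minimum stiffness, which is how stiffness patterns can be compared with end-position variability ellipses.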
Mattar AAG, Ostry DJ (2010) Generalization of dynamics learning across changes in movement amplitude. J Neurophysiol 104:426-438.
Abstract - Article in PDF format (552 KB)
Studies of generalization reveal how learning is encoded in the brain. Previous studies have shown rather limited generalization
of dynamics learning across changes in movement direction, a finding
that is consistent with the idea that learning is primarily local. In
contrast, studies show a broader pattern of generalization across
changes in movement amplitude, suggesting a more general form of
learning. To understand this difference, we performed an experiment
in which subjects held a robotic manipulandum and made movements
to targets along the body midline. Subjects were trained in a
velocity-dependent
force field while moving to a 15 cm target. After training,
subjects were tested for generalization using movements to a 30 cm
target. We used force channels in conjunction with movements to the
30 cm target to assess the extent of generalization. Force channels
restricted lateral movements and allowed us to measure force production
during generalization. We compared actual lateral forces to the
forces expected if dynamics learning generalized fully. We found that,
during the test for generalization, subjects produced reliably less force than expected. Force production was appropriate for the portion of the
transfer movement in which velocities corresponded to those experienced
with the 15 cm target. Subjects failed to produce the expected
forces when velocities exceeded those experienced in the training
task. This suggests that dynamics learning generalizes little beyond
the range of one’s experience. Consistent with this result, subjects
who trained on the 30 cm target showed full generalization to the 15
cm target. We performed two additional experiments that show that
interleaved trials to the 30 cm target during training on the 15 cm
target can resolve the difference between the current results and those
reported previously.
Ostry DJ, Darainy M, Mattar AAG, Wong J, Gribble PL (2010) Somatosensory plasticity and motor learning. J Neurosci 30:5384-5393.
Abstract - Article in PDF format (1016 KB) - J Neurosci Journal Club Commentary PDF format (210 KB) - J Neurophysiol Neuro Forum Commentary PDF format (106 KB)
Motor learning is dependent upon plasticity in motor areas of the brain, but does it occur in isolation, or does it also result in changes to sensory systems? We examined changes to somatosensory function that occur in conjunction with motor learning. We found that even after periods of training as brief as 10 min, sensed limb position was altered and the perceptual change persisted for 24 h. The perceptual change was reflected in subsequent movements; limb movements following learning deviated from the prelearning trajectory by an amount that was not different in magnitude and in the same direction as the perceptual shift. Crucially, the perceptual change was dependent upon motor learning. When the limb was displaced passively such that subjects experienced similar kinematics but without learning, no sensory change was observed. The findings indicate that motor learning affects not only motor areas of the brain but changes sensory function as well.
Nasir SM, Ostry DJ (2009) Auditory plasticity and speech motor learning. Proc Natl Acad Sci U S A 106:20470-20475.
Abstract - Article in PDF format (716 KB) - Supporting Information PDF format (101 KB) - Commentary PDF format (100 KB)
Is plasticity in sensory and motor
systems linked? Here, in the context of speech motor learning and
perception, we test the idea that sensory function is modified by motor
learning and, in particular, that speech motor learning affects a
speaker’s auditory map. We assessed speech motor learning by using a
robotic device that displaced the jaw and selectively altered
somatosensory feedback during speech. We found that with practice
speakers progressively corrected for the mechanical perturbation and
after motor learning they also showed systematic changes in their
perceptual classification of speech sounds. The perceptual shift was
tied to motor learning. Individuals who displayed greater amounts of
learning also showed greater perceptual change. Perceptual change was
not observed in control subjects who produced the same movements in
the absence of a force field, nor in subjects who experienced the
force field but failed to adapt to the mechanical load. The perceptual
effects observed here indicate the involvement of the somatosensory
system in the neural processing of speech sounds and suggest that
speech motor learning results in changes to auditory perceptual
function.
Laboissière R, Lametti DR, Ostry DJ (2009) Impedance control and its relation to precision in orofacial movement. J Neurophysiol 102:523-531.
Abstract - Article in PDF format (355 KB)
Speech production involves some of the most precise
and finely timed patterns of human movement. Here, in the context of
jaw movement in speech, we show that spatial precision in speech
production is systematically associated with the regulation of
impedance and in particular, with jaw stiffness—a measure of resistance
to displacement. We estimated stiffness and also variability during
movement using a robotic device to apply brief force pulses to the jaw.
Estimates of stiffness were obtained using the perturbed position and
force trajectory and an estimate of what the trajectory would be in the
absence of load. We estimated this “reference trajectory” using a new
technique based on Fourier analysis. A moving-average (MA) procedure
was used to estimate stiffness by modeling restoring force as the
moving average of previous jaw displacements. The stiffness matrix was
obtained from the steady state of the MA model. We applied this
technique to data from 31 subjects whose jaw movements were perturbed
during speech utterances and kinematically matched nonspeech movements.
We observed systematic differences in stiffness over the course of
jaw-lowering and jaw-raising movements that were correlated with
measures of kinematic variability. Jaw stiffness was high and
variability was low early and late in the movement when the jaw was
elevated. Stiffness was low and variability was high in the middle of
movement when the jaw was lowered. Similar patterns were observed for
speech and nonspeech conditions. The systematic relationship between
stiffness and variability points to the idea that stiffness regulation
is integral to the control of orofacial movement variability.
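The moving-average stiffness estimate described above can be illustrated in a simplified one-dimensional form. This is a sketch only, not the authors' implementation; the MA weights, noise level, and simulated data are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical MA weights relating past displacement (m) to restoring force (N/m per lag)
b_true = np.array([120.0, 60.0, 20.0])

# Simulated perturbed displacement record and the resulting restoring force,
# modeled as a moving average of previous displacements, plus measurement noise
x = rng.normal(0.0, 0.002, size=500)
f = np.convolve(x, b_true)[: len(x)] + rng.normal(0.0, 0.01, size=len(x))

# Least-squares fit of the MA coefficients from the (displacement, force) record
lags = len(b_true)
X = np.column_stack(
    [np.concatenate([np.zeros(k), x[: len(x) - k]]) for k in range(lags)]
)
b_hat, *_ = np.linalg.lstsq(X, f, rcond=None)

# Steady-state stiffness: with displacement held constant, force -> x * sum(b),
# so the stiffness estimate is the sum of the fitted MA coefficients
k_hat = float(b_hat.sum())
print(round(k_hat))  # close to sum(b_true) = 200 N/m
```

The key step is the last line: the stiffness is read off the steady state of the fitted MA model rather than from any single lag.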
Darainy M, Mattar AAG, Ostry DJ (2009) Effects of human arm impedance on dynamics learning and generalization. J Neurophysiol 101:3158–3168.
Abstract - Article in PDF format (408 KB)
Previous studies have demonstrated anisotropic
patterns of hand impedance under static conditions and during movement.
Here we show that the pattern of kinematic error observed in studies of
dynamics learning is associated with this anisotropic impedance
pattern. We also show that the magnitude of kinematic error associated
with this anisotropy dictates the amount of motor learning and,
consequently, the extent to which dynamics learning generalizes.
Subjects were trained to reach to visual targets while holding a
robotic device that applied forces during movement. On infrequent
trials, the load was removed and the resulting kinematic error was
measured. We found a strong correlation between the pattern of
kinematic error and the anisotropic pattern of hand stiffness. In a
second experiment subjects were trained under force-field conditions to
move in two directions: one in which the dynamic perturbation was in
the direction of maximum arm impedance and the associated kinematic
error was low and another in which the perturbation was in the
direction of low impedance where kinematic error was high.
Generalization of learning was assessed in a reference direction that
lay intermediate to the two training directions. We found that transfer
of learning was greater when training occurred in the direction
associated with the larger kinematic error. This suggests that the
anisotropic patterns of impedance and kinematic error determine the
magnitude of dynamics learning and the extent to which it generalizes.
Ito T, Tiede M, Ostry DJ (2009) Somatosensory function in speech perception.
Proc Natl Acad Sci U S A 106:1245–1248.
Abstract - Article in PDF format (240 KB)
Somatosensory signals from the facial skin and muscles of the vocal
tract provide a rich source of sensory input in speech production. We
show here that the somatosensory system is also involved in the perception
of speech. We use a robotic device to create patterns of facial skin
deformation that would normally accompany speech production. We find that
when we stretch the facial skin while people listen to words, it alters
the sounds they hear. The systematic perceptual variation we observe in
conjunction with speech-like patterns of skin stretch indicates that
somatosensory inputs affect the neural processing of speech sounds and
shows the involvement of the somatosensory system in the perceptual
processing of speech.
Nasir SM, Ostry DJ (2008) Speech motor learning in profoundly deaf adults. Nat Neurosci 11:1217–1222.
Abstract - Article in PDF format (400 KB) - Neuropod Podcast - Dispatch PDF format (64 KB)
Speech production, like other sensorimotor
behaviors, relies on multiple sensory inputs: audition, proprioceptive
inputs from muscle spindles and cutaneous inputs from
mechanoreceptors in the skin and soft tissues of the vocal tract. However, the
capacity for intelligible speech by deaf speakers suggests that somatosensory input alone may contribute to speech motor
control and perhaps even to speech learning. We assessed speech motor learning
in cochlear implant recipients who were tested with their implants turned off.
A robotic device was used to alter somatosensory
feedback by displacing the jaw during speech. We found that implant subjects
progressively adapted to the mechanical perturbation with training. Moreover,
the corrections that we observed were for movement deviations that were
exceedingly small, on the order of millimeters, indicating that speakers have
precise somatosensory expectations. Speech motor
learning is substantially dependent on somatosensory
input.
Darainy M, Ostry DJ (2008) Muscle cocontraction following dynamics learning. Exp Brain Res 190:153-163.
Abstract - Article in PDF format (1.0 MB)
Coactivation of antagonist
muscles is readily observed early in motor learning, in interactions with unstable
mechanical environments and in motor system pathologies. Here we present
evidence that the nervous system uses coactivation
control far more extensively and that patterns of cocontraction
during movement are closely tied to the specific requirements of the task. We
have examined the changes in cocontraction that
follow dynamics learning in tasks that are thought to involve finely sculpted feedforward adjustments to motor commands. We find that,
even following substantial training, cocontraction
varies in a systematic way that depends on both movement direction and the
strength of the external load. The proportion of total activity that is due to cocontraction nevertheless remains remarkably constant. Moreover,
long after indices of motor learning and electromyographic
measures have reached asymptotic levels, cocontraction
still accounts for a significant proportion of total muscle activity in all
phases of movement and in all load conditions. These results show that even
following dynamics learning in predictable and stable environments, cocontraction forms a central part of the means by which the
nervous system regulates movement.
Andres M, Ostry DJ, Nicol F, Paus T (2008) Time course of number magnitude interference during grasping. Cortex 44:414-419.
Abstract - Article in PDF format (395 KB)
In
the present study, we recorded the kinematics of grasping movements in order to
measure the possible interference caused by digits printed on the visible face
of the objects to grasp. The aim of this approach was to test the hypothesis
that digit magnitude processing shares common mechanisms with object size
estimate during grasping. In the first stages of reaching, grip aperture was
found to be larger consequent to the presentation of digits with a high value
rather than a low one. The effect of digit magnitude on grip aperture was more
pronounced for large objects. As the hand got closer to the object, the
influence of digit magnitude decreased and grip aperture progressively
reflected the actual size of the object. We concluded that number magnitude may
interact with grip aperture while programming the grasping movements.
Tremblay S, Houle G, Ostry DJ (2008) Specificity of speech motor learning. J Neurosci 28:2426–2434.
Abstract - Article in PDF format (395 KB)
The idea that the brain controls movement using a neural
representation of limb dynamics has been a dominant hypothesis in
motor control research for well over a decade. Speech
movements offer an unusual opportunity to test this proposal by
means of an examination of transfer of learning between utterances
that are to varying degrees matched on kinematics. If speech
learning results in a generalizable
dynamics representation, then, at the least, learning should
transfer when similar movements are embedded in phonetically
distinct utterances. We tested this idea using three different pairs
of training and transfer utterances that substantially overlap kinematically. We find that, with these stimuli,
speech learning is highly contextually sensitive and fails to
transfer even to utterances that involve very similar movements.
Speech learning appears to be extremely local, and the specificity
of learning is incompatible with the idea that speech control
involves a generalized dynamics representation.
Darainy M, Towhidkhah F, Ostry DJ (2007) Control of hand impedance under static conditions and during reaching movement. J Neurophysiol 97:2676–2685.
Abstract - Article in PDF format (1890 KB)
It is known that humans can modify the impedance of the musculoskeletal
periphery, but the extent of this modification is uncertain. Previous
studies on impedance control under static conditions indicate a
limited ability to modify impedance, whereas studies of impedance
control during reaching in unstable environments suggest a greater
range of impedance modification. As a first step in accounting for
this difference, we quantified the extent to which stiffness changes
from posture to movement even when there are no destabilizing
forces. Hand stiffness was estimated under static conditions and at
the same position during both longitudinal (near to
far) and lateral movements using a position-servo technique. A new
method was developed to predict the hand "reference" trajectory
for purposes of estimating stiffness. For movements in a
longitudinal direction, there was considerable counterclockwise rotation
of the hand stiffness ellipse relative to stiffness under static
conditions. In contrast, a small counterclockwise rotation was
observed during lateral movement. In the modeling studies, even when
we used the same modeled cocontraction level during posture and
movement, we found that there was a substantial difference in the
orientation of the stiffness ellipse, comparable with that observed
empirically. Indeed, the main determinant of the orientation of the
ellipse in our modeling studies was the movement direction and the
muscle activation associated with movement. Changes in the
cocontraction level and the balance of cocontraction had smaller
effects. Thus even when there is no environmental instability, the
orientation of stiffness ellipse changes during movement in a manner
that varies with movement direction.
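The stiffness ellipse and its orientation, referred to above, can be obtained from a 2x2 endpoint stiffness matrix by eigendecomposition. This is a generic sketch; the matrix values are hypothetical, not data from the study:

```python
import numpy as np

# Hypothetical symmetric endpoint stiffness matrix (N/m) in hand coordinates
K = np.array([[300.0, 100.0],
              [100.0, 500.0]])

# Eigendecomposition yields the stiffness ellipse: eigenvalues give stiffness
# along the principal axes, eigenvectors give the axis directions.
vals, vecs = np.linalg.eigh(K)          # eigenvalues sorted ascending
major = vecs[:, -1]                     # eigenvector of the largest eigenvalue
orientation = np.degrees(np.arctan2(major[1], major[0])) % 180.0

print(np.round(vals, 1))                # [258.6 541.4]
print(round(orientation, 1))            # 67.5 (degrees from the x axis)
```

The modulo-180 step removes the sign ambiguity of the eigenvector, since an ellipse axis has no preferred direction.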
Lametti DR, Houle G, Ostry DJ (2007) Control of movement variability and the regulation of limb impedance. J Neurophysiol 98:3516-3524.
Abstract - Article in PDF format (1257 KB)
Humans routinely make movements to targets that have different accuracy
requirements in different directions. Examples extend from everyday
occurrences such as grasping the handle of a coffee cup to the more
refined instance of a surgeon positioning a scalpel. The attainment
of accuracy in situations such as these might be related to the
nervous system's capacity to regulate the limb's resistance to
displacement, or impedance. To test this idea, subjects made
movements from random starting locations to targets that had
shape-dependent accuracy requirements. We used a robotic device to
assess both limb impedance and patterns of movement variability just
as the subject reached the target. We show that impedance increases
in directions where required accuracy is high. Independent of target
shape, patterns of limb stiffness are seen
to predict spatial patterns of movement variability. The nervous
system is thus seen to modulate limb impedance in entirely
predictable environments to aid in the attainment of reaching
accuracy.
Mattar AAG, Ostry DJ (2007) Neural averaging in motor learning. J Neurophysiol 97:220-228.
Abstract - Article in PDF format (462 KB)
The capacity for skill development over multiple training episodes is
fundamental to human motor function. We have studied the process by
which skills evolve with training by progressively modifying a
series of motor learning tasks that subjects performed over a 1-mo
period. In a series of empirical and modeling studies, we show that
performance undergoes repeated modification with new learning. Each
in a series of prior training episodes contributes such that present
performance reflects a weighted average of previous learning.
Moreover, we have observed that the relative weighting of skills
learned wholly in the past changes with time. This suggests that the
neural substrate of skill undergoes modification after
consolidation.
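The weighted-average account above can be made concrete with a minimal sketch; the episode values and weights below are hypothetical illustrations, not data from the study:

```python
# Present performance modeled as a weighted average of adaptation levels
# reached in prior training episodes; weights are recency-biased and sum to 1.
episodes = [0.2, 0.6, 0.9]   # hypothetical adaptation level per past episode
weights = [0.1, 0.3, 0.6]    # hypothetical relative weighting of each episode

performance = sum(w * e for w, e in zip(weights, episodes))
print(round(performance, 2))  # 0.74
```

The paper's further observation, that the relative weighting of past skills changes with time, would correspond here to the weights themselves drifting between test sessions.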
Mattar AAG, Ostry DJ (2007) Modifiability of generalization in dynamics learning. J Neurophysiol 98:3321-3329.
Abstract - Article in PDF format (1155 KB)
Studies on plasticity in motor function have shown that motor learning
generalizes, such that movements in novel situations are affected by
previous training. It has been shown that the pattern of
generalization for visuomotor rotation learning
changes when training movements are made to a wide distribution of
directions. Here we have found that for dynamics learning, the shape
of the generalization gradient is not similarly modifiable by the extent
of training within the workspace. Subjects learned to control
a robotic device during training and we measured how subsequent
movements in a reference direction were affected. Our results show
that as the angular separation between training and test directions
increased, the extent of generalization was reduced. When training
involved multiple targets throughout the workspace, the extent of
generalization was no greater than following training to the nearest
target alone. Thus a wide range of experience compensating for a
dynamics perturbation provided no greater benefit than localized
training. Instead, generalization was complete when training
involved targets that bounded the reference direction. This suggests
that broad generalization of dynamics learning to movements in novel
directions depends on interpolation between instances of localized
learning.
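One way to picture interpolation between instances of localized learning is a Gaussian-weighted sum over trained directions. This is a sketch only; the directions, adaptation levels, and tuning width are hypothetical, not the paper's model:

```python
import math

# Hypothetical trained directions (deg) and the adaptation level reached at each
trained = {-30.0: 1.0, 30.0: 1.0}

def generalization(test_deg, width=25.0):
    """Predicted adaptation in a test direction: Gaussian-weighted sum of
    locally learned instances, capped at complete (1.0) adaptation."""
    g = sum(
        a * math.exp(-((test_deg - d) ** 2) / (2.0 * width ** 2))
        for d, a in trained.items()
    )
    return min(g, 1.0)

# Generalization is nearly complete for a direction bounded by the trained
# directions and falls off for directions outside the trained range.
print(round(generalization(0.0), 2))   # 0.97
print(round(generalization(60.0), 2))  # 0.49
```

Under these assumptions, adding more training directions far from the test direction contributes almost nothing, mirroring the finding that broad training was no better than localized training at the nearest target.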
Nasir SM, Ostry DJ (2006) Somatosensory precision in speech production. Curr Biol 16:1918–1923.
Abstract - Article in PDF format (408 KB) - Supplemental Data PDF format (152 KB) - Dispatch PDF format (132 KB)
Speech production is dependent on both auditory and somatosensory feedback.
Although audition may appear to be the dominant sensory modality in
speech production, somatosensory information plays a role that extends
from brainstem responses to cortical control. Accordingly, the motor
commands that underlie speech movements may have somatosensory as well
as auditory goals. Here we provide evidence that, independent of the
acoustics, somatosensory information is central to achieving the precision
requirements of speech movements. We were able to dissociate auditory
and somatosensory feedback by using a robotic device that altered the
jaw's motion path, and hence proprioception, without affecting speech
acoustics. The loads were designed to target either the consonant- or
vowel-related portion of an utterance because these are the major sound
categories in speech. We found that, even in the absence of any effect
on the acoustics, with learning subjects corrected to an equal extent
for both kinds of loads. This finding suggests that there are comparable
somatosensory precision requirements for both kinds of speech sounds.
We provide experimental evidence that the neural control of stiffness
or impedance--the resistance to displacement--provides for somatosensory
precision in speech production.
Darainy M, Malfait N, Towhidkhah F, Ostry DJ (2006) Transfer and durability of acquired patterns of human arm stiffness. Exp Brain Res 170:227-237.
Abstract - Article in PDF format (308 KB)
We used a robotic device to test the idea that impedance control involves
a process of learning or adaptation that is acquired over time and permits
the voluntary control of the pattern of stiffness at the hand. The tests
were conducted in statics. Subjects were trained over the course of three
successive days to resist the effects of one of three different kinds
of mechanical loads, single axis loads acting in the lateral direction,
single axis loads acting in the forward/backward direction and isotropic
loads that perturbed the limb in eight directions about a circle. We found
that subjects in contact with single axis loads voluntarily modified their
hand stiffness orientation such that changes to the direction of maximum
stiffness mirrored the direction of applied load. In the case of isotropic
loads, a uniform increase in endpoint stiffness was observed. Using a
physiologically realistic model of two-joint arm movement, the experimentally
determined pattern of impedance change could be replicated by assuming
that coactivation of elbow and double joint muscles was independent of
coactivation of muscles at the shoulder. Moreover, using this pattern
of coactivation control we were able to replicate an asymmetric pattern
of rotation of the stiffness ellipse that was observed empirically. The
present findings are consistent with the idea that arm stiffness is controlled
through the use of at least two independent cocontraction commands.
Shiller DM, Houle G, Ostry DJ (2005) Voluntary control of human jaw stiffness. J Neurophysiol 94:2207-2217.
Abstract - Article in PDF format (794 KB)
Recent studies of human arm movement have suggested that the control
of stiffness may be important both for maintaining stability and for achieving
differences in movement accuracy. In the present study, we have examined
the voluntary control of postural stiffness in 3D in the human jaw. The
goal is to address the possible role of stiffness control in both stabilizing
the jaw and in achieving the differential precision requirements of speech
sounds. We previously showed that patterns of kinematic variability in
speech are systematically related to the stiffness of the jaw. If the
nervous system uses stiffness control as a means to regulate kinematic
variation in speech, it should also be possible to show that subjects
can voluntarily modify jaw stiffness. Using a robotic device, a series
of force pulses was applied to the jaw to elicit changes in stiffness
to resist displacement. Three orthogonal directions and three magnitudes
of forces were tested. In all conditions, subjects increased the magnitude
of jaw stiffness to resist the effects of the applied forces. Apart from
the horizontal direction, greater increases in stiffness were observed
when larger forces were applied. Moreover, subjects differentially increased
jaw stiffness along a vertical axis to counteract disturbances in this
direction. The observed changes in the magnitude of stiffness in different
directions suggest an ability to control the pattern of stiffness of the
jaw. The results are interpreted as evidence that jaw stiffness can be
adjusted voluntarily, and thus may play a role in stabilizing the jaw
and in controlling movement variation in the orofacial system.
Malfait N, Gribble PL, Ostry DJ (2005) Generalization of motor learning based on multiple field exposures and local adaptation. J Neurophysiol 93:3327-3338.
Abstract - Article in PDF format (3976 KB)
Previous studies have used transfer of learning over workspace locations
as a means to determine whether subjects code information about dynamics
in extrinsic or intrinsic coordinates. Transfer has been observed when
the torque associated with joint displacement is similar between workspace
locations - rather than when the mapping between hand displacement and
force is preserved - which is consistent with muscle- or joint-based encoding.
In the present study, we address the generality of an intrinsic coding
of dynamics and examine how generalization occurs when the pattern of
torques varies over the workspace. In two initial experiments, we examined
transfer of learning when the direction of a force field was fixed relative
to an external frame of reference. While there were no beneficial effects
of transfer following training at a single location (Experiments 1 and
2), excellent performance was observed at the center of the workspace
following training at two lateral locations (Experiment 2). Experiment
3 and associated simulations assessed the characteristics of this generalization.
In these studies, we examined the patterns of transfer observed following
adaptation to force fields that were composed of two subfields that acted
in opposite directions. The experimental and simulated data are consistent
with the idea that information about dynamics is encoded in intrinsic
coordinates. The nervous system generalizes dynamics learning by interpolating
between sets of control signals, each locally adapted to different patterns
of torques.
Della-Maggiore V, Malfait N, Ostry DJ, Paus T (2004) Stimulation of the posterior parietal cortex interferes with arm trajectory adjustments during the learning of new dynamics. J Neurosci 24:9971-9976.
Abstract - Article in PDF format (177 KB)
Substantial neurophysiological evidence points to the posterior parietal
cortex (PPC) as playing a key role in the coordinate transformation necessary
for visually guided reaching. Our goal was to examine the role of PPC in
the context of learning new dynamics of arm movements. We assessed this
possibility by stimulating PPC with transcranial magnetic stimulation (TMS)
while subjects learned to make reaching movements with their right hand
in a velocity-dependent force field. We reasoned that, if PPC is necessary
to adjust the trajectory of the arm as it interacts with a novel mechanical
system, interfering with the functioning of PPC would impair adaptation.
Single pulses of TMS were applied over the left PPC 40 msec after the onset
of movement during adaptation. As a control, another group of subjects was
stimulated over the visual cortex. During early stages of learning, the
magnitude of the error (measured as the deviation of the hand paths) was
similar across groups. By the end of the learning period, however, error
magnitudes decreased to baseline levels for controls but remained significantly
larger for the group stimulated over PPC. Our findings are consistent with
a role of PPC in the adjustment of motor commands necessary for adapting
to a novel mechanical environment.
Darainy M, Malfait N, Gribble PL, Towhidkhah F, Ostry DJ (2004) Learning to control arm stiffness under static conditions. J Neurophysiol 92:3344-3350.
Abstract - Article in PDF format (267 KB)
Malfait N, Ostry DJ (2004) Is interlimb transfer of force-field adaptation a "cognitive" response to the sudden introduction of load? J Neurosci 24:8084-8089.
Abstract - Article in PDF format (342 KB)
Recently, Shadmehr and colleagues (Criscimagna-Hemminger et al. 2003)
reported a pattern of generalization of force-field adaptation between
arms that differs from the pattern that occurs across different configurations
of the same arm. While the intralimb pattern of generalization points
to an intrinsic encoding of dynamics, the interlimb transfer described
by these authors indicates that information about force is represented
in a frame of reference external to the body. In the present study, subjects
adapted to a viscous curl-field in two experimental conditions. In one
condition, the field was introduced suddenly and produced clear deviations
in hand paths; in the second condition, the field was introduced gradually
so that at no point during the adaptation process did subjects observe
or have to correct for a substantial kinematic error. In the first case,
a pattern of interlimb transfer consistent with Criscimagna-Hemminger
et al. was observed, whereas no transfer of learning between limbs occurred
in the second condition. The findings suggest that there is limited transfer
of fine compensatory force adjustment between limbs. Transfer, when it
does occur, may be largely the result of a "cognitive" strategy arising
from the sudden introduction of load and the associated kinematic
error.
Petitto LA, Holowka S, Sergio LE, Levy B, Ostry DJ (2004) Baby hands that move to the rhythm of language: hearing babies acquiring sign languages babble silently on the hands. Cognition 93:43-73.
Abstract - Article in PDF format (382 KB)
The "ba, ba, ba" sound universal to babies' babbling around 7 months
captures scientific attention because it provides insights into the mechanisms
underlying language acquisition and vestiges of its evolutionary origins.
Yet the prevailing mystery is what is the biological basis of babbling,
with one hypothesis being that it is a non-linguistic motoric activity
driven largely by the baby's emerging control over the mouth and jaw,
and another being that it is a linguistic activity reflecting the babies'
early sensitivity to specific phonetic-syllabic patterns. Two groups of
hearing babies were studied over time (ages 6, 10, and 12 months), equal
in all developmental respects except for the modality of language input
(mouth versus hand): three hearing babies acquiring spoken language (group
1: "speech-exposed") and a rare group of three hearing babies acquiring
sign language only, not speech (group 2: "sign-exposed"). Despite this
latter group's exposure to sign, the motoric hypothesis would predict
similar hand activity to that seen in speech-exposed hearing babies because
language acquisition in sign-exposed babies does not involve the mouth.
Using innovative quantitative Optotrak 3-D motion-tracking technology,
applied here for the first time to study infant language acquisition,
we obtained physical measurements similar to a speech spectrogram, but
for the hands. Here we discovered that the specific rhythmic frequencies
of the hands of the sign-exposed hearing babies differed depending on
whether they were producing linguistic activity, which they produced at
a low frequency of approximately 1 Hz, versus non-linguistic activity,
which they produced at a higher frequency of approximately 2.5 Hz - the
identical class of hand activity that the speech-exposed hearing babies
produced nearly exclusively. Surprisingly, without benefit of the mouth,
hearing sign-exposed babies alone babbled systematically on their hands.
We conclude that babbling is fundamentally a linguistic activity and explain
why the differentiation between linguistic and non-linguistic hand activity
in a single manual modality (one distinct from the human mouth) could
only have resulted if all babies are born with a sensitivity to specific
rhythmic patterns at the heart of human language and the capacity to use
them.
Ostry DJ, Feldman AG (2003) A critical evaluation of the force control hypothesis in motor control. Exp Brain Res 153:275-288.
Abstract - Article in PDF format (235 KB)
The ability to formulate explicit mathematical models of motor systems
has played a central role in recent progress in motor control research.
As a result of these modeling efforts and in particular the incorporation
of concepts drawn from control systems theory, ideas about motor control
have changed substantially. There is growing emphasis on motor learning
and particularly on predictive or anticipatory aspects of control that
are related to the neural representation of dynamics. Two ideas have become
increasingly prominent in mathematical modeling of motor function—forward
internal models and inverse dynamics. The notion of forward internal models
which has drawn from work in adaptive control arises from the recognition
that the nervous system takes account of dynamics in motion planning.
Inverse dynamics, a complementary way of adjusting control signals to
deal with dynamics, has proved a simple means to establish the joint torques
necessary to produce desired movements. In this paper, we review the force
control formulation in which inverse dynamics and forward internal models
play a central role. We present evidence in its favor and describe its
limitations. We note that inverse dynamics and forward models are potential
solutions to general problems in motor control—how the nervous system
establishes a mapping between desired movements and associated control
signals, and how control signals are adjusted in the context of motor
learning, dynamics and loads. However, we find little empirical evidence
that specifically supports the inverse dynamics or forward internal model
proposals per se. We further conclude that the central idea of the force
control hypothesis—that control levels operate through the central specification
of forces—is flawed. This is specifically evident in the context of attempts
to incorporate physiologically realistic muscle and reflex mechanisms
into the force control model. In particular, the formulation offers no
means to shift between postures without triggering resistance due to postural
stabilizing mechanisms.
Tremblay S, Shiller DM, Ostry DJ (2003) Somatosensory basis of speech production. Nature 423:866-869.
Abstract - Article in PDF format (341 KB) - Commentary in Nature Reviews Neuroscience PDF format (387 KB)
The hypothesis that speech goals are defined acoustically and maintained
by auditory feedback is a central idea in speech production research.
An alternative proposal is that speech production is organized in terms
of control signals that subserve movements and associated vocal-tract
configurations. Indeed, the capacity for intelligible speech by deaf speakers
suggests that somatosensory inputs related to movement play a role in
speech production, but studies that might have documented a somatosensory
component have been equivocal. For example, mechanical perturbations that
have altered somatosensory feedback have simultaneously altered acoustics.
Hence, any adaptation observed under these conditions may have been a
consequence of acoustic change. Here we show that somatosensory information
on its own is fundamental to the achievement of speech movements. This
demonstration involves a dissociation of somatosensory and auditory feedback
during speech production. Over time, subjects correct for the effects
of a complex mechanical load that alters jaw movements (and hence somatosensory
feedback), but which has no measurable or perceptible effect on acoustic
output. The findings indicate that the positions of speech articulators
and associated somatosensory inputs constitute a goal of speech movements
that is wholly separate from the sounds produced.
Complete listing