The Handbook of Speech Perception


Скачать книгу

Human Perception and Performance, 26(3), 806–819.

      157 Sams, M., Manninen, P., Surakka, V., et al. (1998). McGurk effect in Finnish syllables, isolated words, and words in sentences: Effects of word meaning and sentence context. Speech Communication, 26(1–2), 75–87.

      158 Sanchez, K., Dias, J. W., & Rosenblum, L. D. (2013). Experience with a talker can transfer across modalities to facilitate lipreading. Attention, Perception, & Psychophysics, 75, 1359–1365.

      159 Sanchez, K., Miller, R. M., & Rosenblum, L. D. (2010). Visual influences on alignment to voice onset time. Journal of Speech, Language, and Hearing Research, 53, 262–272.

      160 Santi, A., Servos, P., Vatikiotis‐Bateson, E., et al. (2003). Perceiving biological motion: Dissociating visible speech from walking. Journal of Cognitive Neuroscience, 15(6), 800–809.

      161 Sato, M., Buccino, G., Gentilucci, M., & Cattaneo, L. (2010). On the tip of the tongue: Modulation of the primary motor cortex during audiovisual speech perception. Speech Communication, 52(6), 533–541.

      162 Sato, M., Cavé, C., Ménard, L., & Brasseur, A. (2010). Auditory‐tactile speech perception in congenitally blind and sighted adults. Neuropsychologia, 48(12), 3683–3686.

      163 Schall, S., & von Kriegstein, K. (2014). Functional connectivity between face‐movement and speech‐intelligibility areas during auditory‐only speech perception. PLOS ONE, 9(1), 1–11.

      164 Schelinski, S., Riedel, P., & von Kriegstein, K. (2014). Visual abilities are important for auditory‐only speech recognition: Evidence from autism spectrum disorder. Neuropsychologia, 65, 1–11.

      165 Schwartz, J. L., Berthommier, F., & Savariaux, C. (2004). Seeing to hear better: Evidence for early audio‐visual interactions in speech identification. Cognition, 93(2), B69–B78.

      166 Schweinberger, S. R., & Soukup, G. R. (1998). Asymmetric relationships among perceptions of facial identity, emotion, and facial speech. Journal of Experimental Psychology: Human Perception and Performance, 24, 1748–1765.

      167 Sekiyama, K., & Tohkura, Y. (1991). McGurk effect in non‐English listeners: Few visual effects for Japanese subjects hearing Japanese syllables of high auditory intelligibility. Journal of the Acoustical Society of America, 90(4), 1797–1805.

      168 Sekiyama, K., & Tohkura, Y. (1993). Inter‐language differences in the influence of visual cues in speech perception. Journal of Phonetics, 21(4), 427–444.

      169 Shams, L., Iwaki, S., Chawla, A., & Bhattacharya, J. (2005). Early modulation of visual cortex by sound: An MEG study. Neuroscience Letters, 378(2), 76–81.

      170 Shams, L., Wozny, D. R., Kim, R., & Seitz, A. (2011). Influences of multisensory experience on subsequent unisensory processing. Frontiers in Psychology, 2, 264.

      171 Shahin, A. J., Backer, K. C., Rosenblum, L. D., & Kerlin, J. R. (2018). Neural mechanisms underlying cross‐modal phonetic encoding. Journal of Neuroscience, 38(7), 1835–1849.

      172 Sheffert, S. M., Pisoni, D. B., Fellowes, J. M., & Remez, R. E. (2002). Learning to recognize talkers from natural, sinewave, and reversed speech samples. Journal of Experimental Psychology: Human Perception and Performance, 28(6), 1447–1469.

      173 Simmons, D. C., Dias, J. W., Dorsi, J., & Rosenblum, L. D. (2015). Crossmodal transfer of talker learning. Poster presented at the 169th meeting of the Acoustical Society of America, Pittsburgh, Pennsylvania, May.

      174 Skipper, J. I., Nusbaum, H. C., & Small, S. L. (2005). Listening to talking faces: Motor cortical activation during speech perception. NeuroImage, 25(1), 76–89.

      175 Skipper, J. I., van Wassenhove, V., Nusbaum, H. C., & Small, S. L. (2007). Hearing lips and seeing voices: How cortical areas supporting speech production mediate audiovisual speech perception. Cerebral Cortex, 17(10), 2387–2399.

      176 Soto‐Faraco, S., & Alsius, A. (2007). Conscious access to the unisensory components of a crossmodal illusion. NeuroReport, 18, 347–350.

      177 Soto‐Faraco, S., & Alsius, A. (2009). Deconstructing the McGurk–MacDonald illusion. Journal of Experimental Psychology: Human Perception and Performance, 35, 580–587.

      178 Stoffregen, T. A., & Bardy, B. G. (2001). On specification and the senses. Behavioral and Brain Sciences, 24(2), 195–213.

      179 Strand, J., Cooperman, A., Rowe, J., & Simenstad, A. (2014). Individual differences in susceptibility to the McGurk effect: Links with lipreading and detecting audiovisual incongruity. Journal of Speech, Language, and Hearing Research, 57, 2322–2331.

      180 Striem‐Amit, E., Dakwar, O., Hertz, U., et al. (2011). The neural network of sensory‐substitution object shape recognition. Functional Neurology, Rehabilitation, and Ergonomics, 1(2), 271–278.

      181 Sumby, W. H., & Pollack, I. (1954). Visual contribution to speech intelligibility in noise. Journal of the Acoustical Society of America, 26(2), 212–215.

      182 Summerfield, Q. (1987). Some preliminaries to a comprehensive account of audio‐visual speech perception. In B. Dodd & R. Campbell (Eds), Hearing by eye: The psychology of lip‐reading (pp. 53–83). London: Lawrence Erlbaum.

      183 Summerfield, Q., & McGrath, M. (1984). Detection and resolution of audiovisual incompatibility in the perception of vowels. Quarterly Journal of Experimental Psychology, 36A, 51–74.

      184 Sundara, M., Namasivayam, A. K., & Chen, R. (2001). Observation‐execution matching system for speech: A magnetic stimulation study. NeuroReport, 12(7), 1341–1344.

      185 Swaminathan, S., MacSweeney, M., Boyles, R., et al. (2013). Motor excitability during visual perception of known and unknown spoken languages. Brain and Language, 126(1), 1–7.

      186 Teinonen, T., Aslin, R. N., Alku, P., & Csibra, G. (2008). Visual speech contributes to phonetic learning in 6‐month‐old infants. Cognition, 108(3), 850–855.

      187 Thomas, S. M., & Jordan, T. R. (2002). Determining the influence of Gaussian blurring on inversion effects with talking faces. Perception & Psychophysics, 64, 932–944.

      188 Tiippana, K. (2014). What is the McGurk effect? Frontiers in Psychology, 5, 725–728.

      189 Tiippana, K., Andersen, T. S., & Sams, M. (2004). Visual attention modulates audiovisual speech perception. European Journal of Cognitive Psychology, 16(3), 457–472.

      190 Treille, A., Cordeboeuf, C., Vilain, C., & Sato, M. (2014). Haptic and visual information speed up the neural processing of auditory speech in live dyadic interactions. Neuropsychologia, 57(1), 71–77.

      191 Treille, A., Vilain, C., & Sato, M. (2014). The sound of your lips: Electrophysiological cross‐modal interactions during hand‐to‐face and face‐to‐face speech perception. Frontiers in Psychology, 5, 1–8.

      192 Turner, T. H., Fridriksson, J., Baker, J., et al. (2009). Obligatory Broca’s area modulation associated with passive speech perception. NeuroReport, 20(5), 492–496.

      193 Uno, T., Kawai, K., Sakai, K., et al. (2015). Dissociated roles of the inferior frontal gyrus and superior temporal sulcus in audiovisual processing: Top‐down and bottom‐up mismatch detection. PLOS ONE, 10(3).

      194 van de Rijt, L. P. H., van Opstal, A. J., Mylanus, E. A. M., et al. (2016). Temporal cortex activation to audiovisual speech in normal‐hearing and cochlear implant users measured with functional near‐infrared spectroscopy. Frontiers in Human Neuroscience, 10, 1–14.

      195 Van Engen, K. J., Xie, Z., & Chandrasekaran, B. (2016). Audiovisual sentence recognition is not predicted by susceptibility to the McGurk effect. Attention, Perception, & Psychophysics, 79, 396–403.

      196 van Wassenhove, V. (2013). Speech through ears and eyes: Interfacing the senses with the supramodal brain. Frontiers in Psychology, 4, 1–17.

      197 van Wassenhove, V., Grant, K. W., & Poeppel, D. (2007). Temporal window of integration in auditory‐visual speech perception. Neuropsychologia, 45(3), 598–607.

      198 van