Word identification and eye fixation locations in visual and visual-plus-auditory presentations of spoken sentences
References
Abry, C., Lallouache, M.-T., & Cathiard, M.-A. (1996). How can coarticulation models account for speech sensitivity in audio-visual desynchronization? In D. Stork & M. Hennecke (Eds.), Speechreading by humans and machines: Models, systems, and applications (NATO ASI Series, Vol. 150, Series F: Computer and Systems Sciences, pp. 247–255). Berlin: Springer-Verlag.
Althoff, R. R., & Cohen, N. J. (1999). Eye-movement-based memory effect: A reprocessing effect in face perception. Journal of Experimental Psychology: Learning, Memory, & Cognition, 25, 997–1010.
Althoff, R. R., Cohen, N. J., McConkie, G. W., Wasserman, S., Maciukenas, M., Azen, R., & Romine, L. (1999). Eye-movement-based memory assessment. In W. Becker, H. Deubel, & T. Mergner (Eds.), Current oculomotor research: Physiological and psychological aspects (pp. 292–302). New York: Plenum.
American National Standards Institute. (1996). Specification for audiometers (ANSI S3.6-1996). New York: Author.
Argyle, M., & Cook, M. (1976). Gaze and mutual gaze. Cambridge: Cambridge University Press.
Bernstein, L. E., Coulter, D. C., O’Connell, M. P., Eberhardt, S. P., & Demorest, M. E. (1993). Vibrotactile and haptic speech codes. In A. Risberg, S. Felicetti, G. Plant, & K.-E. Spens (Eds.), Proceedings of the Second International Conference on Tactile Aids, Hearing Aids, and Cochlear Implants (pp. 57–70). Stockholm: Kungliga Tekniska Högskolan.
Bernstein, L. E., Demorest, M. E., Coulter, D. C., & O’Connell, M. P. (1991). Lipreading sentences with vibrotactile vocoders: Performance of normal-hearing and hearing-impaired subjects. Journal of the Acoustical Society of America, 95, 3617–3622.
Bernstein, L. E., Demorest, M. E., & Tucker, P. E. (2000). Speech perception without hearing. Perception & Psychophysics, 62, 233–252.
Bernstein, L. E., & Eberhardt, S. P. (1986). Johns Hopkins Lipreading Corpus I-II: Disc I [Laser video disc]. Baltimore: Johns Hopkins University.
Bridgeman, B., & Palca, J. (1980). The role of microsaccades in high acuity observational tasks. Vision Research, 20, 813–817.
Calvert, G., Bullmore, E., Brammer, M., Campbell, R., Woodruff, P., McGuire, P., Williams, S., Iversen, S. D., & David, A. S. (1997). Activation of auditory cortex during silent speechreading. Science, 276, 593–596.
Campbell, R. (1998). Everyday speechreading: Understanding seen speech in action. Scandinavian Journal of Psychology, 39, 163–167.
Campbell, R., & Dodd, B. (1980). Hearing by eye. Quarterly Journal of Experimental Psychology, 32, 85–99.
Campbell, R., & Dodd, B. (1982). Some suffix effects on lipread lists. Canadian Journal of Psychology, 36, 509–515.
Computer Teaching Corporation. (1994). TenCore LAS (Language authoring system: Operating system manual, Version 5.2) [Computer software]. Champaign, IL: Author.
Crowder, R. G., & Morton, J. (1969). Precategorical acoustic storage (PAS). Perception & Psychophysics, 5, 365–373.
Cunitz, R. J., & Steinman, R. M. (1969). Comparison of saccadic eye movements during fixation and reading. Vision Research, 9, 683–693.
Davis, H., & Silverman, S. R. (1970). Hearing and deafness. New York: Holt, Rinehart & Winston.
Findlay, J. M., & Walker, R. (1999). A model of saccade generation based on parallel processing and competitive inhibition. Behavioral & Brain Sciences, 22, 661–721.
Finlay, D. (1982). Motion perception in the peripheral visual field. Perception, 11, 457–472.
Greenberg, H. J., & Bode, D. L. (1968). Visual discrimination of consonants. Journal of Speech & Hearing Research, 11, 466–471.
Ijsseldijk, F. J. (1992). Speechreading performance under different conditions of video image, repetition, and speech rate. Journal of Speech & Hearing Research, 35, 466–477.
Kramer, A. F., Cassavaugh, N. D., Irwin, D. E., Peterson, M. S., & Hahn, S. (2001). Influence of single and multiple onset distractors on visual search for singleton targets. Perception & Psychophysics, 63, 952–968.
Kricos, P. B., & Lesner, S. A. (1982). Differences in visual intelligibility across talkers. Volta Review, 84, 219–225.
Kuhl, P. K., & Meltzoff, A. N. (1982). The bimodal perception of speech in infancy. Science, 218, 1138–1141.
Kuhl, P. K., & Meltzoff, A. N. (1984). The intermodal representation of speech in infants. Infant Behavior & Development, 7, 361–381.
Lansing, C. R., & McConkie, G. W. (1994). A new method for speechreading research. Journal of the Academy of Rehabilitative Audiology, 27, 25–43.
Lansing, C. R., & McConkie, G. W. (1999). Attention to facial regions in segmental and prosodic visual speech perception tasks. Journal of Speech, Language, & Hearing Research, 42, 526–539.
Levy-Schoen, A. (1981). Flexible and/or rigid control of oculomotor scanning behavior. In D. F. Fischer, R. A. Monty, & J. W. Senders (Eds.), Eye movements: Cognition and visual perception (pp. 299–318). Hillsdale, NJ: Erlbaum.
Marassa, L. K., & Lansing, C. R. (1995). Visual word recognition in two facial motion conditions: Full-face versus lips-plus-mandible. Journal of Speech & Hearing Research, 38, 1387–1394.
Mason, M. K. (1943). A cinematographic technique for testing visual speech comprehension. Journal of Speech Disorders, 8, 271–278.
Massaro, D. W. (1998). Perceiving talking faces: From speech perception to a behavioral principle. Cambridge, MA: MIT Press, Bradford Books.
McGurk, H., & MacDonald, J. (1976). Hearing lips and seeing voices. Nature, 264, 746–748.
Miller, G. A., & Nicely, P. E. (1955). An analysis of perceptual confusions among some English consonants. Journal of the Acoustical Society of America, 27, 338–352.
Posner, M. I. (1980). Orienting of attention. Quarterly Journal of Experimental Psychology, 32, 3–25.
Posner, M. I., & Raichle, M. E. (1994). Images of mind. New York: Freeman.
Preminger, J. E., Lin, H.-B., Payen, M., & Levitt, H. (1998). Selective masking in speechreading. Journal of Speech, Language, & Hearing Research, 41, 564–575.
Reinagel, P., & Zador, A. M. (1999). Natural scene statistics at the centre of gaze. Network: Computation in Neural Systems, 10, 341–350.
Rosenblum, L. D., Johnson, J. A., & Saldaña, H. M. (1996). Visual kinematic information for embellishing speech in noise. Journal of Speech & Hearing Research, 39, 1159–1170.
Saldaña, H. M., Nygaard, L. C., & Pisoni, D. B. (1996). Episodic encoding of visual speaker attributes and recognition memory for spoken words. In D. Stork & M. Hennecke (Eds.), Speechreading by humans and machines: Models, systems, and applications (NATO ASI Series, Vol. 150, Series F: Computer and Systems Sciences, pp. 275–281). Berlin: Springer-Verlag.
Sheffert, S. M., & Fowler, C. A. (1995). The effects of voice and visible speaker change on memory for spoken words. Journal of Memory & Language, 34, 665–685.
Sumby, W. H., & Pollack, I. (1954). Visual contributions to speech intelligibility in noise. Journal of the Acoustical Society of America, 26, 212–215.
Summerfield, Q. (1987). Some preliminaries to a comprehensive account of audio-visual speech perception. In B. Dodd & R. Campbell (Eds.), Hearing by eye: The psychology of lipreading (pp. 3–52). Hillsdale, NJ: Erlbaum.
Vatikiotis-Bateson, E., Eigsti, I.-M., & Yano, S. (1994a). Listener eye movement behavior during audiovisual perception. Proceedings of the Acoustical Society of Japan, 94-3, 679–680.
Vatikiotis-Bateson, E., Eigsti, I.-M., & Yano, S. (1994b). Listener eye movement behavior during audiovisual speech perception. In Proceedings of ICSLP 94: International Conference on Spoken Language Processing (Vol. 2, pp. 527–530). Tokyo: Acoustical Society of Japan.
Vatikiotis-Bateson, E., Eigsti, I.-M., Yano, S., & Munhall, K. (1998). Eye movements of perceivers during audiovisual speech perception. Perception & Psychophysics, 60, 926–940.
Winer, B. J. (1962). Statistical principles in experimental design. New York: McGraw-Hill.
Yang, S.-N., & McConkie, G. W. (2001). Eye movements during reading: A theory of saccade initiation times. Vision Research, 41, 3567–3585.
Yarbus, A. L. (1967). Eye movements and vision. New York: Plenum.
Yehia, H., Rubin, P., & Vatikiotis-Bateson, E. (1998). Quantitative association of vocal-tract and facial behavior. Speech Communication, 26, 23–43.