Cognition, Perception, and Cognitive Neuroscience
Jack received his BA from Johns Hopkins University in 1967 and his PhD from the University of Michigan in 1971. After a three-year postdoc at the Smith-Kettlewell Institute of Visual Sciences in San Francisco, he joined the Psychology faculty at UCSB. He retired as a professor in 2009 and closed his lab in 2012. He is the author of over 160 research publications and chapters and was the Principal Investigator on 12 multiyear grants from NIH, NSF, ONR, and AFOSR. In 1991, he and graduate student Andrew Beall began developing virtual reality technology for use in basic psychological research. In 1998, Jack and Professor Jim Blascovich founded the Research Center for Virtual Environments and Behavior (ReCVEB). Jack also initiated and led a 20-year interdisciplinary research project with collaborators Professor Reginald Golledge in Geography and Professor Roberta Klatzky in Psychology (now at Carnegie Mellon University). The project involved applied research and development of a GPS-based navigation system for blind people, as well as basic research on spatial cognition and spatial hearing.
Jack’s research career was distinguished by a rare breadth of scientific interests: vision, touch, hearing, and spatial cognition. His early career focused on touch and color vision. In 1987 he shifted his focus to visual and auditory space perception, visual control of flying and driving, and spatial cognition in blind and sighted people. Much of this research employed the virtual reality technology that he and Andrew Beall developed.
His 20-year collaborative project with Professors Reginald Golledge and Roberta Klatzky on the research and development of a GPS-based navigation system for blind people was premised on Jack’s idea of using virtual sound as the interface with the blind user: the user hears the synthetically spoken labels of waypoints and environmental points of interest as if they were coming from the surrounding environment. While the UCSB project did not result in a practical device by the time it came to a close in 2008, the Soundscape project at Microsoft Research, led by Amos Miller, is now pursuing the feasibility of using virtual sound as such an interface for blind people.
Much of Jack’s research, especially that involving virtual reality, was motivated by an interest in the phenomenology of perception and by the recognition that the everyday world we experience is a mental representation of the underlying physical world. In retirement, Jack continues his interest in the phenomenology of perception and consciousness through his study of nonduality, including Buddhism.
Loomis, J. M., Da Silva, J.A., Fujita, N., & Fukusima, S. S. (1992). Visual space perception and visually directed action. Journal of Experimental Psychology: Human Perception and Performance, 18, 906-921.
Loomis, J. M., & Philbeck, J. W. (2008). Measuring spatial perception with spatial updating and action. In R. L. Klatzky, M. Behrmann, & B. MacWhinney (Eds.), Embodiment, ego-space, and action (pp. 1-43). New York: Taylor & Francis.
Beall, A. C. & Loomis, J. M. (1997). Optic flow and visual analysis of the base-to-final turn. The International Journal of Aviation Psychology, 7, 201-223.
Loomis, J. M. & Beall, A. C. (1998). Visually-controlled locomotion: Its dependence on optic flow, 3-D space perception, and cognition. Ecological Psychology, 10, 271-285.
Loomis, J. M., Klatzky, R. L., Golledge, R. G., Cicinelli, J. G., Pellegrino, J. W., & Fry, P. A. (1993). Nonvisual navigation by blind and sighted: Assessment of path integration ability. Journal of Experimental Psychology: General, 122, 73-91.
Chance, S. S., Gaunet, F., Beall, A. C., & Loomis, J. M. (1998). Locomotion mode affects the updating of objects encountered during travel: The contribution of vestibular and proprioceptive inputs to path integration. Presence, 7, 168-178.
Loomis, J. M., Klatzky, R. L., Golledge, R. G., & Philbeck, J. W. (1999). Human navigation by path integration. In R. G. Golledge (Ed.), Wayfinding: Cognitive mapping and other spatial processes (pp. 125-151). Baltimore: Johns Hopkins University Press.
Loomis, J. M., Klatzky, R. L., & Giudice, N. A. (2013). Representing 3D space in working memory: Spatial images from vision, hearing, touch, and language. In S. Lacey & R. Lawson (Eds.), Multisensory imagery: Theory and applications (pp. 131-155). New York: Springer.
Loomis, J. M. (1992). Distal attribution and presence. Presence, 1, 113-119.
Loomis, J. M., Blascovich, J. J., & Beall, A. C. (1999). Immersive virtual environment technology as a basic research tool in psychology. Behavior Research Methods, Instruments, and Computers, 31, 557-564.
Loomis, J. M., Golledge, R. G., Klatzky, R. L., Speigle, J. M., & Tietz, J. (1994). Personal guidance system for the visually impaired. Proceedings of the First Annual ACM/SIGCAPH Conference on Assistive Technologies (pp. 85-91), Marina del Rey, CA, October 31-November 1, 1994. New York: Association for Computing Machinery.
Loomis, J. M., Golledge, R. G., & Klatzky, R. L. (1998). Navigation system for the blind: Auditory display modes and guidance. Presence, 7, 193-203.
Loomis, J. M., Klatzky, R. L., & Giudice, N. A. (2012). Sensory substitution of vision: Importance of perceptual and cognitive processing. In R. Manduchi & S. Kurniawan (Eds.), Assistive technology for blindness and low vision (pp. 162-191). Boca Raton, FL: CRC Press.
Loomis, J. M. & Berger, T. (1979). Effects of chromatic adaptation on color discrimination and color appearance. Vision Research, 19, 891-901.
Loomis, J. M. (1990). A model of character recognition and legibility. Journal of Experimental Psychology: Human Perception and Performance, 16, 106-120.