Abstract: Advances in personal computing often go hand-in-hand with advances in display technology. Today's augmented reality (AR) systems aim to seamlessly merge our computer's display into our view of the natural world. But AR systems often suffer from technical and visual limitations, such as small eyeboxes, limited brightness, and narrow visual field coverage, not to mention poor aesthetics. An integral part of AR system development, therefore, is perceptual research that improves our understanding of when and why these limitations matter for the user. I will describe a series of AR system engineering challenges and associated perceptual studies designed to provide guidance for navigating them. Our work highlights the idiosyncrasies of how the visual system integrates information from the two eyes, the complexities of quantifying visual field coverage, and the unique puzzles that must be solved for a computer display to be worn on your face.
Bio: Emily Cooper is an associate professor at the University of California, Berkeley in the Herbert Wertheim School of Optometry & Vision Science and the Helen Wills Neuroscience Institute. Her research spans 3D vision, display system design, assistive technology, and computational neuroscience. She holds a B.A. in English Language & Literature from the University of Chicago and a Ph.D. in Neuroscience from the University of California, Berkeley, and she completed her postdoctoral research at Stanford University. She is currently chair of the interdisciplinary Vision Science PhD Program, a co-Director of the Center for Innovation in Vision & Optics, and a Visiting Faculty Researcher at Google.
Events are free and open to the public unless otherwise noted.