Announcements

[Seminar] 2018/03/28 15:30–17:30, Towards Investigating the Effects of Stereoscopic Voxelization, Adaptive Focal Plane Shifting, and Depth of Field Rendering on Near-Field Distance Perception in MR Displays

Towards Investigating the Effects of Stereoscopic Voxelization, Adaptive Focal Plane Shifting, and Depth of Field Rendering on Near-Field Distance Perception in MR Displays

Time: 2018/03/28 15:30–17:30

Place: ED117

Speaker: Sabarish V. Babu, Associate Professor, Clemson University

Speaker Bio:

Sabarish “Sab” Babu is an Associate Professor in the Division of Human Centered Computing in the School of Computing at Clemson University in the USA. He received his BS (2000), MS (2002), and PhD (2007) degrees from the University of North Carolina at Charlotte, and completed a post-doctoral fellowship in the Department of Computer Science at the University of Iowa prior to joining Clemson University in 2010. His research interests are in the areas of virtual environments, applied perception in graphics and visualization, educational virtual reality, and 3D human-computer interaction. He has authored or co-authored over 70 peer-reviewed publications in premier venues sponsored by IEEE and ACM. He was the General Chair of the IEEE International Conference on Virtual Reality (IEEE VR) 2016, the premier international venue for VR research, and also served as a Program Chair for IEEE VR 2017. He and his students have received Best Paper Awards at the IEEE International Conference on 3D User Interaction, the ACM Symposium on Applied Perception, and the IEEE International Conference on Healthcare Informatics. His research has been sponsored by the US National Science Foundation, the US Department of Labor, and the St. Francis and Medline Medical Foundations, and he has received over US $1.5 million in external funding as a PI or Co-PI on several research projects related to virtual reality.

Abstract:

For the past eight years, my research group has been rigorously investigating near-field depth perception and visuo-motor recalibration in head-mounted display VR and real-world viewing situations. Distance perception provides fundamental information for higher-order percepts such as size, scale, shape, and speed, and for accurate fine motor task performance in virtual world simulations. Because directly measuring depth perception in the human brain is very difficult, researchers instead infer distance estimation from immediate fine motor actions toward the perceived stimuli. We conducted an empirical evaluation to examine to what extent near-field distance estimation (up to one's maximum reach) is accurate in real and immersive virtual environments. We found that participants misperceive egocentric distances in virtual reality viewing as compared to real-world viewing, even when the virtual environment is a carefully modeled, accurate replica of the real-world environment. In further experiments, we investigated how visuo-motor recalibration with visual and haptic feedback can enhance egocentric distance perception in virtual reality viewing. We also examined how scaling the operator's fine motor actions to the visually perceived location of targets in VR differentially affects near-field distance estimation. More recently, we have found that the visual realism of immersive self-embodiment via self-avatars can enhance near-field distance estimation in interactions in VR, compared to low-fidelity self-embodiment metaphors such as visualizing only the operator's joint locations or only the end-effector.
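To make the motor-action scaling manipulation concrete, here is a minimal sketch of one way such a gain could be applied to a tracked reach. The linear-gain model, function names, and parameter values are illustrative assumptions, not the speaker's exact experimental manipulation.

```python
import numpy as np

def scaled_hand_position(shoulder: np.ndarray,
                         tracked_hand: np.ndarray,
                         gain: float) -> np.ndarray:
    """Render the virtual hand at a gain-scaled reach.

    gain > 1 draws the virtual hand beyond the physical hand,
    gain < 1 draws it short of it; gain = 1 is a veridical mapping.
    """
    reach = tracked_hand - shoulder        # physical reach vector
    return shoulder + gain * reach         # gain-scaled virtual hand

# Example: amplify a 0.40 m forward reach by 10%.
shoulder = np.array([0.0, 1.40, 0.0])      # metres, y-up coordinates
hand = np.array([0.0, 1.40, 0.40])
print(scaled_hand_position(shoulder, hand, gain=1.1))  # z = 0.44 m
```

Because the participant sees the scaled hand but feels the real one, repeated reaches under such a gain are what drive the visuo-motor recalibration measured in these studies.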

These previous studies investigated how properties of 3D interaction in VR affect near-field distance estimation. From a viewing perspective, however, high-resolution stereoscopic displays, or novel configurations of multiple large-screen stereoscopic displays such as CAVEs, have the potential to increase near-field stereoscopic voxel density and resolution. This in turn could improve near-field depth perception, and it remains an open area of research. Also, dynamically co-locating the virtual focal plane with the viewing target, or with the binocular convergence distance, by adaptively changing the focal length of the lens in the HMD viewing pipeline has the potential to minimize the vergence-accommodation conflict and enhance near-field depth perception in VR. Additionally, incorporating perceptual illusions such as depth-of-field rendering in the virtual world can potentially calibrate near-field distance perception and lead to effective motor task performance in activities such as selection and manipulation by physical reaching in applications like surgical simulation.
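As a rough illustration of the optics behind the last two ideas, the sketch below computes a binocular convergence distance, the thin-lens focal length needed to shift the virtual focal plane to that distance, and the circle-of-confusion diameter that drives depth-of-field blur. It assumes symmetric fixation and an ideal thin lens; the function names and numbers are illustrative, not taken from any particular MR display.

```python
import math

def vergence_distance(ipd: float, vergence_angle: float) -> float:
    """Distance (m) at which the two gaze rays converge, assuming
    symmetric fixation; vergence_angle is the full angle (rad)
    between the two lines of sight."""
    return (ipd / 2.0) / math.tan(vergence_angle / 2.0)

def required_focal_length(screen_dist: float, image_dist: float) -> float:
    """Thin-lens focal length (m) that places the virtual image of a
    display at screen_dist out at image_dist (image_dist > screen_dist),
    from 1/f = 1/d_screen - 1/d_image for a virtual image."""
    return (screen_dist * image_dist) / (image_dist - screen_dist)

def circle_of_confusion(depth: float, focus_depth: float,
                        aperture: float, focal_len: float) -> float:
    """Blur-circle diameter (m) of a point at `depth` when the optics
    are focused at `focus_depth`; this is the quantity a depth-of-field
    renderer converts into per-pixel blur."""
    return abs(aperture * focal_len * (depth - focus_depth)
               / (depth * (focus_depth - focal_len)))

# Example: fixating a near-field target 0.5 m away with a 64 mm IPD.
angle = 2.0 * math.atan(0.032 / 0.5)                # vergence angle (rad)
print(vergence_distance(0.064, angle))              # -> 0.5 m
print(required_focal_length(0.04, 0.5))             # lens for a 4 cm screen
print(circle_of_confusion(0.8, 0.5, 0.004, 0.017))  # blur at 0.8 m (~53 um)
```

In an adaptive focal-plane display of the kind described above, a quantity like `required_focal_length` would be re-evaluated each frame from the tracked vergence angle, so that accommodation and convergence demand agree at the fixated target.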