Title: Human-Centered Visual Computing: Harnessing Symbiosis Between Computer Architecture, Imaging, and Biological Perception
Speaker: Dr. Yuhao Zhu
Date and Time: Thursday, October 12th, 11:00 am
Location: EH 2430
Emerging platforms such as Augmented Reality (AR), Virtual Reality (VR), and autonomous machines, while computational in nature, interact intimately with both the environment and humans. They must be built, from the ground up, with principled consideration of three main components: imaging, computer systems, and human perception. This talk will make a case for this tenet and discuss some of our recent work on this front. I will first talk about in-sensor visual computing: the idea that co-designing the image sensor with the computer system will significantly improve overall system efficiency and, perhaps more importantly, unlock new machine capabilities. We will show a number of case studies in AR/VR and autonomous machines. I will then discuss our work on human-systems co-optimization, where we 1) exploit human color vision to build energy-efficient AR/VR devices and 2) computationally “cure” color blindness without gene therapy.
Yuhao Zhu is an Assistant Professor in the Department of Computer Science and a Bridging Fellow faculty member in the Department of Brain and Cognitive Sciences at the University of Rochester. He graduated from The University of Texas at Austin and previously held visiting researcher positions at Arm Research and Harvard University. Yuhao Zhu does software and hardware research in visual computing, an exciting area where a challenging problem in computing can become significantly easier when one considers, at the system level, how computing interacts with conventionally non-computing domains such as imaging and human perception.
Hosted By: Prof. Hyoukjun Kwon