Electrical Engineering & Computer Science (EECS)
Energy-Efficient Multimedia Systems Group (EEMS)
Vivienne Sze, Associate Professor, EECS (Thesis Advisor)
Berthold K.P. Horn, Professor, EECS
Dr. Charles Mathy, Engineer, Analog Devices
Depth sensing is useful for many applications, ranging from augmented reality to robotic navigation. Time-of-flight (ToF) cameras, which obtain depth by emitting light and estimating its round-trip time, are appealing depth sensors because they obtain dense depth maps with minimal latency. However, on mobile and embedded devices, ToF cameras can be power-hungry and limit the battery life of the underlying device. To reduce the power required for depth sensing, we present algorithms that address two scenarios.
For applications where images are concurrently collected, we present algorithms that reduce the usage of the ToF camera and estimate new depth maps without illuminating the scene. We exploit the fact that many applications operate in nearly rigid environments; our algorithms use sparse correspondences across consecutive images to estimate the rigid motion and then use it to obtain new depth maps. Our techniques reduce the usage of the ToF camera by up to 85% while still estimating new depth maps within 1% of the ground truth for rigid scenes and 1.74% for dynamic ones. Compared to applications that rely solely on the ToF camera and incur higher sensor power, and to those that estimate depth entirely from monocular images, which is inaccurate or obtained with high latency, our approach balances ToF camera usage with computation to obtain accurate depth maps with minimal latency.
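The core idea above can be illustrated with a short sketch: given a depth map, the camera intrinsics, and a rigid motion estimated from sparse correspondences, each pixel is back-projected to 3D, transformed, and re-projected to form a new depth map without firing the ToF emitter. This is a minimal NumPy illustration of the general reprojection step, not the thesis's actual implementation; the function name and z-buffer hole handling here are assumptions for the sketch.

```python
import numpy as np

def reproject_depth(depth, K, R, t):
    """Warp a depth map to a new camera pose using a rigid transform.

    depth : (H, W) depth map previously acquired by the ToF camera
    K     : (3, 3) camera intrinsics
    R, t  : rigid motion (rotation, translation) estimated from sparse
            correspondences between consecutive images
    """
    H, W = depth.shape
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    pix = np.stack([u, v, np.ones_like(u)], axis=0).reshape(3, -1)
    # Back-project each pixel to a 3D point: X = depth * K^{-1} [u, v, 1]^T
    X = np.linalg.inv(K) @ pix * depth.reshape(1, -1)
    # Apply the rigid motion and project back into the image plane
    Xn = R @ X + t.reshape(3, 1)
    proj = K @ Xn
    z = proj[2]
    un = np.round(proj[0] / z).astype(int)
    vn = np.round(proj[1] / z).astype(int)
    # Scatter transformed depths into the new map; keep the nearest
    # surface at each pixel (a simple z-buffer for occlusions)
    new_depth = np.full((H, W), np.inf)
    valid = (un >= 0) & (un < W) & (vn >= 0) & (vn < H) & (z > 0)
    np.minimum.at(new_depth, (vn[valid], un[valid]), z[valid])
    new_depth[np.isinf(new_depth)] = 0.0  # pixels with no estimate
    return new_depth
```

Pixels that become occluded or leave the frame produce holes (set to zero here); a full system would fill them, e.g., from the next ToF acquisition.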
When only the data from a ToF camera is used, we propose algorithms that reduce the overall amount of light that the ToF camera emits to obtain accurate depth maps. Our techniques use the rigid motions in the scene, which can be estimated from the infrared images that a ToF camera obtains, to temporally mitigate the impact of noise. We show that our approaches reduce the amount of emitted light by up to 81% and the mean relative error of the depth maps by up to 64%. Our algorithms are all computationally efficient and obtain dense depth maps in near real time, and in some cases real time, on standard and embedded platforms. Taken together, our work enables energy-efficient depth sensing for many emerging applications.
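One way to picture the temporal noise mitigation described above: a new, noisier measurement taken under reduced illumination is fused per pixel with a motion-compensated previous estimate. The inverse-variance weighting below is a generic stand-in for the thesis's actual filtering, and the function name and interface are assumptions for illustration.

```python
import numpy as np

def fuse_depth(warped_prev, noisy_new, var_prev, var_new):
    """Fuse a motion-compensated previous depth estimate with a new,
    noisier measurement acquired with less emitted light.

    Per-pixel inverse-variance weighting: the lower-variance estimate
    receives more weight, so noise is averaged down over time.
    All arguments are (H, W) arrays.
    """
    w = var_new / (var_prev + var_new)       # weight on the previous estimate
    fused = w * warped_prev + (1.0 - w) * noisy_new
    # Variance of the fused estimate shrinks with each measurement
    fused_var = (var_prev * var_new) / (var_prev + var_new)
    return fused, fused_var
```

With equal variances the fused map is the mean of the two inputs and its variance halves, which is how repeated low-power measurements can approach the accuracy of a single high-power one.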
James completed his PhD under the supervision of Prof. Vivienne Sze in May 2020. For his doctoral research, he proposed algorithms and systems to lower the power of time-of-flight depth cameras. Time-of-flight cameras are becoming ubiquitous in many new mobile devices and enable applications that range from augmented reality to robotic navigation. However, these sensors are power-hungry and reduce the battery life of many mobile devices. The ideas presented in this thesis can be used to lower the power of these sensors while still acquiring low-latency, accurate depth maps.