In a paper titled “Burst Vision Using Single-Photon Cameras”, Sizhuo Ma, Paul Mos, Edoardo Charbon, and Mohit Gupta from the University of Wisconsin-Madison and École polytechnique fédérale de Lausanne write:
Single-photon avalanche diodes (SPADs) are novel image sensors that record the arrival of individual photons at extremely high temporal resolution. In the past, they were only available as single pixels or small-format arrays, for various active imaging applications such as LiDAR and microscopy. Recently, high-resolution SPAD arrays up to 3.2 megapixel have been realized, which for the first time may be able to capture sufficient spatial details for general computer vision tasks, purely as a passive sensor. However, existing vision algorithms are not directly applicable on the binary data captured by SPADs. In this paper, we propose developing quanta vision algorithms based on burst processing for extracting scene information from SPAD photon streams. With extensive real-world data, we demonstrate that current SPAD arrays, along with burst processing as an example plug-and-play algorithm, are capable of a wide range of downstream vision tasks in extremely challenging imaging conditions including fast motion, low light ($<5$ lux) and high dynamic range. To our knowledge, this is the first attempt to demonstrate the capabilities of SPAD sensors for a wide gamut of real-world computer vision tasks including object detection, pose estimation, SLAM, and text recognition. We hope this work will inspire future research into developing computer vision algorithms in extreme scenarios using single-photon cameras.
The paper's key contributions include:

- Dealing with motion blur in extremely low light
- Dealing with extreme dynamic range
- A large dataset of over 50 million binary burst frames spanning a wide range of computer vision tasks
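To give a sense of what "burst processing" of binary photon data can look like, here is a minimal sketch (not the paper's actual algorithm) of recovering a per-pixel intensity estimate from a burst of 1-bit SPAD frames. It assumes a standard Bernoulli/Poisson model in which each binary frame records whether at least one photon arrived at a pixel during the exposure; the function name and simulation parameters are illustrative.

```python
import numpy as np

def estimate_flux(binary_frames):
    """Maximum-likelihood flux estimate from a burst of 1-bit frames.

    Under Poisson photon arrivals with mean flux lam (photons/pixel/frame),
    P(pixel fires) = 1 - exp(-lam), so lam_hat = -ln(1 - p_hat), where p_hat
    is the fraction of frames in which the pixel fired.
    """
    p_hat = binary_frames.mean(axis=0)        # per-pixel firing rate
    p_hat = np.clip(p_hat, 0.0, 1.0 - 1e-6)   # avoid log(0) at saturation
    return -np.log(1.0 - p_hat)

# Simulate a burst of binary frames from a uniform scene (illustrative values).
rng = np.random.default_rng(0)
true_flux = 0.5                               # mean photons/pixel/frame
frames = (rng.poisson(true_flux, size=(2000, 32, 32)) > 0).astype(np.float32)
recovered = estimate_flux(frames).mean()      # close to true_flux
```

Averaging many 1-bit frames and inverting the Bernoulli model in this way is what lets a purely passive SPAD sensor trade its extreme temporal resolution for usable intensity images, even at very low photon counts.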
The paper will be presented at the upcoming Winter Conference on Applications of Computer Vision (WACV) in January 2023.
The full paper is available here: https://wisionlab.com/wp-content/uploads/2022/11/burst_vision_wisionlab.pdf