Journal papers
Ruiming Cao; Dekel Galor; Amit Kohli; Jacob L. Yates; Laura Waller
Noise2Image: noise-enabled static scene recovery for event cameras Journal Article
In: Optica, vol. 12, no. 1, pp. 46–55, 2025.
@article{Cao:25b,
title = {Noise2Image: noise-enabled static scene recovery for event cameras},
author = {Ruiming Cao and Dekel Galor and Amit Kohli and Jacob L. Yates and Laura Waller},
url = {https://opg.optica.org/optica/abstract.cfm?URI=optica-12-1-46},
doi = {10.1364/OPTICA.538916},
year = {2025},
date = {2025-01-01},
journal = {Optica},
volume = {12},
number = {1},
pages = {46--55},
publisher = {Optica Publishing Group},
abstract = {Event cameras, also known as dynamic vision sensors, are an emerging modality for measuring fast dynamics asynchronously. Event cameras capture changes of log-intensity over time as a stream of ``events'' and generally cannot measure intensity itself; hence, they are only used for imaging dynamic scenes. However, fluctuations due to random photon arrival inevitably trigger noise events, even for static scenes. While previous efforts have been focused on filtering out these undesirable noise events to improve signal quality, we find that, in the photon-noise regime, these noise events are correlated with the static scene intensity. We analyze the noise event generation and model its relationship to illuminance. Based on this understanding, we propose a method, called Noise2Image, to leverage the illuminance-dependent noise characteristics to recover the static parts of a scene, which are otherwise invisible to event cameras. We experimentally collect a dataset of noise events on static scenes to train and validate Noise2Image. Our results show that Noise2Image can robustly recover intensity images solely from noise events, providing an approach for capturing static scenes in event cameras, without additional hardware.},
keywords = {Beam splitters; Cameras; CMOS cameras; Fluorescence microscopy; Neural networks; Three dimensional reconstruction},
pubstate = {published},
tppubtype = {article}
}