In this keynote at the 2024 International Conference on Computational Photography (ICCP), Prof. Davide Scaramuzza from the University of Zurich presents event cameras: bio-inspired vision sensors that offer ultra-low latency, high dynamic range, and minimal power consumption compared with conventional cameras. He covers the motivation behind event-based cameras, explains how these sensors work, and explores their mathematical modeling and processing frameworks. He then highlights cutting-edge applications across computer vision, robotics, autonomous vehicles, virtual reality, and mobile devices, and closes with the open challenges and future directions shaping this exciting field.
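As a rough illustration of the sensor model discussed in the talk: each pixel works independently and fires an "event" whenever the log brightness at that pixel changes, since the pixel's last event, by more than a contrast threshold. The short sketch below shows that per-pixel rule; the function name and threshold value are illustrative assumptions, not code or numbers from the talk.

def check_event(log_intensity_now, log_intensity_at_last_event, contrast_threshold=0.2):
    """Return +1 (ON event), -1 (OFF event), or 0 (no event) for a single pixel."""
    delta = log_intensity_now - log_intensity_at_last_event
    if delta >= contrast_threshold:
        return +1   # brightness increased by more than the threshold: ON event
    if delta <= -contrast_threshold:
        return -1   # brightness decreased by more than the threshold: OFF event
    return 0        # change too small: the pixel stays silent

Because only changing pixels report, a static scene produces almost no data, which is where the low-latency and low-power advantages mentioned above come from.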
00:00 - Why event cameras matter to robotics and computer vision
07:24 - Bandwidth-latency tradeoff
08:24 - Working principle of the event camera
10:50 - Who sells event cameras
12:27 - Relation between event cameras and the biological eye
13:19 - Mathematical model of the event camera
15:35 - Image reconstruction from events
18:32 - A simple optical-flow algorithm
20:20 - How to process events in general
21:28 - 1st order approximation of the event generation model
23:56 - Application 1: Event-based feature tracking
25:03 - Application 2: Ultimate SLAM
26:30 - Application 3: Autonomous navigation in low light
27:38 - Application 4: Keeping drones flying when a rotor fails
31:06 - Contrast maximization for event cameras
34:14 - Application 1: Video stabilization
35:16 - Application 2: Motion segmentation
36:32 - Application 3: Dodging dynamic objects
38:57 - Application 4: Catching dynamic objects
39:41 - Application 5: High-speed inspection at Boeing and Strata
41:33 - Combining events and RGB cameras and how to apply deep learning
45:18 - Application 1: Slow-motion video
48:34 - Application 2: Video deblurring
49:45 - Application 3: Advanced Driver Assistance Systems (ADAS)
56:34 - History and future of event cameras
58:42 - Reading material and Q&A