This issue is to research options for external autofocus that can be added to any camera and (manual) lens.
There are different technologies to measure the distance to an object:
Main article: https://en.wikipedia.org/wiki/Range_imaging
- LIDAR
- RADAR
- Optical flow detection
- Stereo image triangulation (depth follows from disparity; see the short sketch after this list)
- Coded aperture
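As a quick note on the stereo triangulation option: once the same point is matched in both images, depth follows directly from the disparity between them. A minimal sketch with purely illustrative numbers (focal length, baseline and disparity below are made-up example values, not from any real setup):

```python
# Stereo triangulation: depth Z = f * B / d
#   f = focal length expressed in pixels
#   B = baseline between the two cameras in meters
#   d = disparity in pixels of the matched point between the two images
# All values below are illustrative.
focal_length_px = 1400.0
baseline_m = 0.10        # 10 cm between the cameras
disparity_px = 35.0

depth_m = focal_length_px * baseline_m / disparity_px
print(f"Estimated subject distance: {depth_m:.2f} m")  # -> 4.00 m
```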
The most accurate technology seems to be LIDAR, which is why it is used in self-driving cars.
There are some commercial LIDAR/ToF systems:
- DJI Ronin 3D Focus System (€159, range when filming a person is about 6 meters)
- Blickfeld Cube (made for industrial applications like self-driving cars)
- Velodyne Velabit
- Livox Horizon (€1,199)
- Lumotive
- Arducam Time of Flight (ToF) Camera for Raspberry Pi (MSRP 50 USD)
  - Range: 4 m (far mode), 2 m (near mode)
  - Accuracy: ±2 cm (far mode), ±4 cm (near mode)
  - Lens FOV: 70° diagonal (equivalent to a 30 mm lens on a full-frame sensor)
- 3D Time of Flight (ToF) Camera - DepthVista (See3CAM_TOF_25CUG) (sample 699 USD)
  - Depth operating range: 0.2 m to 1.2 m (near mode), 1 m to 6 m (far mode)
  - Depth accuracy: up to 1% (e.g. at 50 cm the error is about ±5 mm)
  - FOV: 84.29° (H) x 64.14° (V) x 99.75° (D) (with the lens provided by e-con)
  - Also has an RGB camera with FHD 1920 x 1080 @ 30 fps
  - Linux software support
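Whichever ToF/LIDAR module is chosen, the basic autofocus loop looks the same: read a depth frame, estimate the subject distance (here the median of a centered region of interest), and map that distance to a position on the lens focus ring. The sketch below is only an outline: `read_depth_frame()`, `set_focus_motor()` and the calibration table are placeholders for the real sensor SDK, the motor driver and a per-lens calibration, none of which are specified yet.

```python
import time
import numpy as np

# Hypothetical hardware interfaces - replace with the actual ToF SDK
# (e.g. whatever library the chosen sensor ships with) and the actual
# focus motor driver.
def read_depth_frame() -> np.ndarray:
    """Return a HxW array of distances in meters (placeholder)."""
    raise NotImplementedError

def set_focus_motor(position: float) -> None:
    """Move the focus ring to a normalized position 0.0..1.0 (placeholder)."""
    raise NotImplementedError

# Calibration table: subject distance (m) -> focus ring position (0..1).
# These values are made up; a real table has to be measured per lens,
# because focus ring rotation is not linear in distance.
CALIBRATION = [(0.5, 0.05), (1.0, 0.20), (2.0, 0.45), (4.0, 0.75), (8.0, 1.00)]

def distance_to_focus(distance_m: float) -> float:
    """Linearly interpolate the focus position from the calibration table."""
    dists, positions = zip(*CALIBRATION)
    return float(np.interp(distance_m, dists, positions))

def subject_distance(depth: np.ndarray, roi_fraction: float = 0.2) -> float:
    """Median depth in a centered region of interest, ignoring invalid (0) pixels."""
    h, w = depth.shape
    dh, dw = int(h * roi_fraction / 2), int(w * roi_fraction / 2)
    roi = depth[h // 2 - dh:h // 2 + dh, w // 2 - dw:w // 2 + dw]
    valid = roi[roi > 0]
    return float(np.median(valid)) if valid.size else float("nan")

def focus_loop(rate_hz: float = 30.0) -> None:
    while True:
        distance = subject_distance(read_depth_frame())
        if not np.isnan(distance):
            set_focus_motor(distance_to_focus(distance))
        time.sleep(1.0 / rate_hz)
```

Using the median of a small ROI instead of a single pixel should make the distance estimate robust against dropouts and noisy pixels in the depth frame.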
It would be awesome if we could build an open hardware solid-state LIDAR. Maybe there are some research papers we could implement. We could collaborate with open hardware self-driving car projects. Maybe some university is interested in such a project.
One practical use of such a system for video would be conference recording. Today a camera operator has to follow the speaker and keep them in focus. With a system that moves the camera (a gimbal or robot arm) combined with LIDAR-driven focus, this could be fully automated!
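For the conference scenario the same idea extends naturally: detect the speaker in the RGB image, steer the gimbal so the face stays centered, and focus on the depth measured inside the face box. The sketch below uses OpenCV's bundled Haar-cascade face detector; the frame sources, gimbal and focus interfaces are placeholders, and it assumes the RGB and depth frames are aligned and have the same resolution.

```python
import cv2
import numpy as np

# Placeholders for the actual hardware - replace with the real RGB/depth
# sources (aligned, same resolution), the gimbal/robot-arm controller
# and the focus motor interface.
def read_rgb_frame() -> np.ndarray:
    raise NotImplementedError

def read_depth_frame() -> np.ndarray:
    raise NotImplementedError

def nudge_gimbal(pan_error: float, tilt_error: float) -> None:
    raise NotImplementedError

def set_focus_for_distance(distance_m: float) -> None:
    raise NotImplementedError

# OpenCV's stock frontal-face Haar cascade.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def track_speaker_once() -> None:
    rgb, depth = read_rgb_frame(), read_depth_frame()
    gray = cv2.cvtColor(rgb, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return  # nobody detected, keep the last gimbal/focus position
    # Assume the largest detected face is the speaker.
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
    cx, cy = x + w // 2, y + h // 2
    frame_h, frame_w = gray.shape
    # Normalized offset of the face from the frame center drives the gimbal.
    nudge_gimbal((cx - frame_w / 2) / frame_w, (cy - frame_h / 2) / frame_h)
    # Median depth inside the face box (ignoring invalid 0 pixels) drives focus.
    face_depth = depth[y:y + h, x:x + w]
    valid = face_depth[face_depth > 0]
    if valid.size:
        set_focus_for_distance(float(np.median(valid)))
```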