I would like to be able to access a low-resolution, uncompressed RGB (24-bit) video stream from Linux userspace. It should either have a low enough bandwidth requirement to be streamed over the network, or be low enough resolution that one of the Zynq ARM cores inside the camera can compress it in real time with existing software (e.g. ffmpeg, VLC, etc.).
A 512x288 image (4096/8 wide), created by skipping pixels, would be perfectly fine IMHO.
Bandwidth requirement estimate:
Size per image: 432 kB
Bandwidth at 25 FPS: 10.5 MByte/s
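For reference, the arithmetic behind those two numbers is just frame size times frame rate, nothing camera-specific:

```python
# Sanity check of the estimate above: 24-bit RGB at 512x288, 25 FPS.
width, height, bytes_per_pixel, fps = 512, 288, 3, 25

frame_bytes = width * height * bytes_per_pixel                             # 442,368 bytes per frame
print(f"Size per image: {frame_bytes / 1024:.0f} kB")                      # -> 432 kB
print(f"Bandwidth at {fps} FPS: {frame_bytes * fps / 2**20:.1f} MByte/s")  # -> 10.5 MByte/s
```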
This live video feed could be packaged into an RTSP stream and then viewed on any network-connected device.
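As a rough sketch of how that could work with existing software, the snippet below pipes raw RGB24 frames into ffmpeg from Python and publishes them as RTSP. The raw source path is purely hypothetical, and the encoder settings and RTSP URL are assumptions (it also expects an RTSP server such as mediamtx already listening on that address); the point is only that plain ffmpeg can take a 512x288 rawvideo feed and turn it into a network stream.

```python
import subprocess

WIDTH, HEIGHT, FPS = 512, 288, 25
FRAME_SIZE = WIDTH * HEIGHT * 3            # 24-bit RGB -> 432 kB per frame

RAW_SOURCE = "/dev/axiom_raw"              # hypothetical raw-frame source, not a real device name
RTSP_URL = "rtsp://127.0.0.1:8554/live"    # assumes an RTSP server (e.g. mediamtx) is listening here

# ffmpeg reads raw frames on stdin, encodes them with x264 and publishes to the RTSP server.
ffmpeg = subprocess.Popen(
    [
        "ffmpeg",
        "-f", "rawvideo",                  # uncompressed input
        "-pix_fmt", "rgb24",               # 24-bit RGB
        "-s", f"{WIDTH}x{HEIGHT}",
        "-r", str(FPS),
        "-i", "-",                         # frames arrive on stdin
        "-c:v", "libx264",
        "-preset", "ultrafast",            # keep the CPU load on the ARM cores low
        "-tune", "zerolatency",
        "-pix_fmt", "yuv420p",
        "-f", "rtsp", RTSP_URL,
    ],
    stdin=subprocess.PIPE,
)

with open(RAW_SOURCE, "rb") as src:
    while True:
        frame = src.read(FRAME_SIZE)       # one full 512x288x3 frame
        if len(frame) < FRAME_SIZE:
            break
        ffmpeg.stdin.write(frame)

ffmpeg.stdin.close()
ffmpeg.wait()
```

Any VLC or ffplay instance on the network should then be able to open rtsp://<camera-ip>:8554/live and show the feed.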
Thoughts?