Create acquisition / output realtime control software
Open, Needs Triage · Public

Description

<se6astian> We need a small tool that runs with real-time priority on Linux
<se6astian> and in one mode that we define, it should trigger a new image capture every n ms, depending on the framerate
<se6astian> then we can also define the output framerate, triggering a new frame over HDMI to be displayed
<se6astian> both triggers take a memory address where a new frame should be stored or read from, if I understood it correctly
<se6astian> this means we can have different fps on input/output
<se6astian> but we can also do burst high-speed acquisition and then slow output of the stored frames
<se6astian> but it needs a central piece of software that manages this
<se6astian> so I wanted to discuss what modes of operation we could need
<se6astian> or what way we could ensure maximum flexibility with configuration
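
A minimal sketch of what such a trigger loop could look like on Linux, assuming the capture is started by poking a memory-mapped register: the register base and offset, the /dev/mem mapping and the 40 ms period are placeholders for illustration, not the actual gateware layout.

    /* Sketch: real-time priority + timer-driven capture trigger.
     * Register block and offsets are hypothetical placeholders. */
    #include <fcntl.h>
    #include <sched.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <sys/mman.h>
    #include <sys/timerfd.h>
    #include <unistd.h>

    #define TRIGGER_BASE   0x80000000UL   /* placeholder register block base */
    #define REG_CAPTURE_GO 0x00           /* placeholder: write 1 to start a capture */

    int main(void)
    {
        /* run with real-time priority so trigger jitter stays low */
        struct sched_param sp = { .sched_priority = 50 };
        if (sched_setscheduler(0, SCHED_FIFO, &sp) < 0)
            perror("sched_setscheduler");

        int memfd = open("/dev/mem", O_RDWR | O_SYNC);
        volatile uint32_t *regs = mmap(NULL, 4096, PROT_READ | PROT_WRITE,
                                       MAP_SHARED, memfd, TRIGGER_BASE);

        /* periodic timer: trigger a new capture every n ms (here 40 ms ~ 25 fps) */
        int tfd = timerfd_create(CLOCK_MONOTONIC, 0);
        struct itimerspec its = {
            .it_interval = { .tv_sec = 0, .tv_nsec = 40 * 1000000L },
            .it_value    = { .tv_sec = 0, .tv_nsec = 40 * 1000000L },
        };
        timerfd_settime(tfd, 0, &its, NULL);

        for (;;) {
            uint64_t expirations;
            read(tfd, &expirations, sizeof(expirations)); /* blocks until the period elapses */
            regs[REG_CAPTURE_GO / 4] = 1;                 /* kick off the next capture */
        }
    }

The same pattern (a second timerfd at a different period) would drive the HDMI output buffer swap, which is how different input/output frame rates fall out naturally.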

<Bertl> what we changed with the gateware basically affects a few (only partially related) things
<Bertl> - we moved the I/O address spaces around to group them according to priority
<Bertl> - we changed the address generator and with it the register layout
<Bertl> - we (mostly) removed the sequencer from the gateware
<Bertl> before the changes we had a quad buffering which got managed by the sequencer
<Bertl> means, the capture would be triggered every time the previous image started transferring via HDMI
<Bertl> and the HDMI buffer changed whenever the previous capture was complete
<Bertl> both pipelines are now independent and only use a single buffer
<Bertl> interrupts were added to communicate HDMI and capture state to userspace
<Bertl> so basically userspace can act on an interrupt, e.g. capture complete, by changing the buffer address and starting a new capture
<Bertl> the HDMI output still runs quasi automatically because we cannot really 'stop' there
<Bertl> but the address used for the buffer can be changed and will take effect on the next frame
<Bertl> that's basically the setup for now
<Bertl> what I plan to add in the near future is a way to tie two frames together into a single HDMI picture
<Bertl> either by using the interlace or the 3D setup
<Bertl> this will hopefully get rid of the A/B problematics
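
A rough sketch of the userspace side of this interrupt-driven scheme, assuming the capture-complete interrupt is exposed as a UIO device; the register offsets and DDR buffer addresses below are invented placeholders, not the real register layout.

    /* Sketch: block until "capture complete", hand the capture engine the
     * next buffer, restart the capture. Offsets/addresses are placeholders. */
    #include <stdint.h>
    #include <unistd.h>

    #define REG_CAPTURE_ADDR 0x04   /* placeholder: DDR address the capture writes to */
    #define REG_CAPTURE_GO   0x00   /* placeholder: start bit */

    static const uint32_t frame_buf[2] = { 0x18000000, 0x19000000 }; /* example DDR buffers */

    void capture_loop(volatile uint32_t *regs, int uio_fd)
    {
        unsigned buf = 0;
        uint32_t irq_enable = 1;

        for (;;) {
            /* re-arm the UIO interrupt, then block until "capture complete" fires */
            write(uio_fd, &irq_enable, sizeof(irq_enable));
            uint32_t irq_count;
            read(uio_fd, &irq_count, sizeof(irq_count));

            /* previous frame is done: hand the other buffer to the capture engine */
            buf ^= 1;
            regs[REG_CAPTURE_ADDR / 4] = frame_buf[buf];
            regs[REG_CAPTURE_GO   / 4] = 1;

            /* the frame in frame_buf[buf ^ 1] is now ready to be queued for HDMI output */
        }
    }

The HDMI side would work the same way in reverse: on its frame interrupt, userspace writes the address of the next ready buffer, which takes effect on the following frame as described above.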

<vup> Can this 'program' be nctrl, or a kernel module that is interfaced by nctrl?

Example applications (a configuration sketch follows this list):

  • Acquisition running at a continuous standard frame rate (e.g. 24/25/30), same rate on HDMI output. -> Sync acquisition and output to make sure we never drop a frame.
  • Acquisition running at a continuous standard frame rate (e.g. 50), different frame rate on HDMI output (e.g. 30 FPS). -> Drop frames at regular intervals on the output.
  • Burst capture (e.g. smaller sensor area or binned) of images at a high frame rate; after a defined number of frames, stop acquisition and output the stored images at a defined FPS.
  • Dynamically change the acquisition frame rate based on some inputs/rules, while maintaining a steady output flow at a standard frame rate as new frames arrive.
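
One possible configuration structure covering these modes; all names here are invented to illustrate the required flexibility, they are not an existing interface.

    /* Sketch of a configuration record for the control tool (hypothetical). */
    #include <stdbool.h>
    #include <stdint.h>

    enum acq_mode {
        ACQ_CONTINUOUS,   /* free-running at acq_fps */
        ACQ_BURST,        /* capture burst_frames as fast as possible, then stop */
        ACQ_EXTERNAL,     /* frame rate driven by external inputs/rules */
    };

    struct rt_config {
        enum acq_mode mode;
        double   acq_fps;       /* e.g. 24/25/30/50; ignored for ACQ_EXTERNAL */
        double   out_fps;       /* HDMI output rate, may differ from acq_fps */
        bool     sync_in_out;   /* lock output to acquisition so no frame is dropped */
        uint32_t burst_frames;  /* only used for ACQ_BURST */
        uint32_t num_buffers;   /* how many DDR frame buffers to cycle through */
    };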

How to interface with it? IPC, such as sockets, is the way to go.
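
A minimal sketch of what such a socket interface could look like, assuming a Unix domain socket with a simple line-based text protocol; the socket path and command strings are placeholders, not a defined protocol.

    /* Sketch: Unix domain socket command listener (path/commands hypothetical). */
    #include <stdio.h>
    #include <string.h>
    #include <sys/socket.h>
    #include <sys/un.h>
    #include <unistd.h>

    #define CTRL_SOCKET "/tmp/axiom-rt-ctrl.sock"   /* placeholder path */

    int main(void)
    {
        int srv = socket(AF_UNIX, SOCK_STREAM, 0);
        struct sockaddr_un addr = { .sun_family = AF_UNIX };
        strncpy(addr.sun_path, CTRL_SOCKET, sizeof(addr.sun_path) - 1);
        unlink(CTRL_SOCKET);
        bind(srv, (struct sockaddr *)&addr, sizeof(addr));
        listen(srv, 1);

        for (;;) {
            int c = accept(srv, NULL, NULL);
            char cmd[128];
            ssize_t n = read(c, cmd, sizeof(cmd) - 1);
            if (n > 0) {
                cmd[n] = '\0';
                /* e.g. "set acq_fps 50" or "start burst 200" -- parse and apply */
                printf("received command: %s", cmd);
            }
            close(c);
        }
    }

This would also answer vup's question: nctrl (or any other client) could talk to the real-time daemon over this socket instead of the daemon being part of nctrl itself.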

Note that we will also need a new snap tool for this altered gateware to write images to the file system.
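
A rough idea of what such a snap tool has to do: mmap the DDR buffer the capture engine was pointed at and dump one raw frame to a file. The buffer address, frame size and packing below are placeholders.

    /* Sketch: dump one captured frame from DDR to a file (values hypothetical). */
    #include <fcntl.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <sys/mman.h>
    #include <unistd.h>

    #define FRAME_ADDR  0x18000000UL              /* placeholder DDR buffer address */
    #define FRAME_SIZE  (4096 * 3072 * 12 / 8)    /* e.g. 4096x3072 @ 12 bit packed */

    int main(void)
    {
        int memfd = open("/dev/mem", O_RDONLY | O_SYNC);
        uint8_t *frame = mmap(NULL, FRAME_SIZE, PROT_READ, MAP_SHARED,
                              memfd, FRAME_ADDR);

        FILE *out = fopen("frame.raw", "wb");
        fwrite(frame, 1, FRAME_SIZE, out);
        fclose(out);

        munmap(frame, FRAME_SIZE);
        close(memfd);
        return 0;
    }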

Sources:
http://vserver.13thfloor.at/Stuff/AXIOM/BETA/cmv_hdmi4/
http://vserver.13thfloor.at/Stuff/AXIOM/BETA/LAM/

sebastian created this task. May 1 2022, 6:59 PM