The exposure time is approx. 200 µs, so it comes nowhere near explaining the delay in the obtained image. That said, my initial thoughts are that the trigger starts the exposure, so you've inherently got the exposure time plus the readout time before you get the End of Frame (EOF) and hence a buffer to pass through GStreamer. Typical ISPs on camera boards have minimal buffering (a few lines), so that shouldn't be adding any significant delay.
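As a back-of-the-envelope check of the point above (exposure alone is not the bottleneck; readout usually dominates), here is a sketch with assumed numbers: the 200 µs exposure is from the post, but the line count and per-line readout time are hypothetical placeholders, not measurements from this thread.

```python
# Minimum trigger-to-buffer latency is roughly exposure + full-frame readout.
EXPOSURE_US = 200    # exposure time quoted in the post
LINES = 800          # assumed sensor height (hypothetical)
LINE_TIME_US = 15    # assumed readout time per line (hypothetical)

readout_us = LINES * LINE_TIME_US        # time to read the whole frame out
min_latency_us = EXPOSURE_US + readout_us
print(min_latency_us)                    # readout dwarfs the 200 us exposure
```

Even with generous assumptions, the readout term is tens of times larger than the exposure, which is why the exposure time "nowhere near explains" the observed delay.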
Not really. I did manage to get a return using VideoCapture(0), but couldn't manage to get the response I wanted. Is there a good reason for going through GStreamer? OpenCV's VideoCapture should be able to interface directly with simple V4L2 devices, although it does have a tendency to always convert the image format, which can be CPU intensive.
That sounds like a good idea. All V4L2 devices should timestamp the incoming frames, generally with the system monotonic clock. If you tweak your code so that the Pi generates the external trigger and you read the monotonic clock at that point, you can then compare it with the buffer timestamp. Note that the buffer timestamp corresponds to the Start of Frame (SOF) interrupt, so you can also read the clock when your application receives the buffer to estimate the readout time.
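The measurement above could be sketched like this. The function name and dictionary keys are hypothetical; the assumption is that all three timestamps come from the same monotonic clock (on the Pi you would read it with `time.clock_gettime(time.CLOCK_MONOTONIC)` at the trigger and at dequeue, and take the SOF time from the V4L2 buffer timestamp).

```python
def latency_breakdown(trigger_ns, sof_ns, dequeue_ns):
    """Split trigger-to-application latency using V4L2 timestamps.

    All values in nanoseconds from the same monotonic clock:
      trigger_ns - read when the Pi asserts the external trigger
      sof_ns     - the V4L2 buffer timestamp (Start of Frame interrupt)
      dequeue_ns - read when the application receives the buffer
    """
    return {
        # exposure plus any sensor-side trigger latency
        "trigger_to_sof_ms": (sof_ns - trigger_ns) / 1e6,
        # frame readout plus driver/queueing latency
        "sof_to_dequeue_ms": (dequeue_ns - sof_ns) / 1e6,
    }
```

Comparing the two components should tell you whether the delay is accrued before SOF (sensor side) or after it (readout and software).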
Speaking of other sensors, has anyone tried the OV9281 and actually measured the latency? Many thanks.
Statistics: Posted by lgcs2500 — Fri Feb 02, 2024 2:30 pm