OV2311 low fps with opencv

Hello,

I want to use the OV2311 camera for a tracking project in Python with OpenCV, but I am running into problems getting the expected frame rate (something close to 60 fps).

Using auto-exposure (which I prefer over manual) I get around 7 fps. Using manual exposure I get around 12 fps for an exposure of 5000, and up to 26 fps (with a black frame) when setting the exposure to 0.
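For reference, this is roughly how I am timing it, modelled on Arducam's capture2opencv.py. Treat it as a sketch rather than my exact script: the arducam_mipicamera/v4l2 calls and the 1600x1300 mode are my assumptions.

```python
# Sketch of my fps measurement, based on Arducam's capture2opencv.py.
import time
import cv2
import v4l2
import arducam_mipicamera as arducam

camera = arducam.mipi_camera()
camera.init_camera()
fmt = camera.set_resolution(1600, 1300)            # assumed OV2311 full-resolution mode
camera.set_control(v4l2.V4L2_CID_EXPOSURE, 5000)   # manual exposure run
# camera.software_auto_exposure(enable=True)       # what I use for the auto-exposure runs

width = (fmt[0] + 31) & ~31    # buffer width aligned up to 32
height = (fmt[1] + 15) & ~15   # buffer height aligned up to 16

frames, t0 = 0, time.time()
while frames < 300:
    frame = camera.capture(encoding='i420')
    image = frame.as_array.reshape(int(height * 1.5), width)
    image = cv2.cvtColor(image, cv2.COLOR_YUV2BGR_I420)   # CPU color conversion
    frames += 1
print("fps: %.1f" % (frames / (time.time() - t0)))
camera.close_camera()
```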

I use Python 2.7.16 and OpenCV 4.2.0 and have tested on the RPi 3 B+ and RPi 4.

The code I am using: https://github.com/richard-bbb/RPi-cam-feed

I don’t know the exact fps of capture2opencv.cpp or capture2opencv.py from the Arducam repo, but they look about the same. Also, all three scripts print the following statement in the terminal in response to calling the set_resolution function from the arducam_mipicamera module:

>mmal: Failed to fix lens shading, use the default mode!

So I guess my question is: how can I increase my fps so it’s usable for fast motion tracking with OpenCV?

Hello,

Don’t worry, I will try my best to help you.

Firstly, Arducam has released the arducamstill demo: https://github.com/ArduCAM/MIPI_Camera/blob/master/RPI/arducamstill.c

This demo just grabs images and displays them. You can use it to test whether the sensor can reach 60 fps.

After confirming that the sensor configuration can reach 60 fps, let us check why the frame rate is low when using OpenCV.

The main factor is that, for the OpenCV path, the image capture and color conversion go through the CPU instead of the GPU (arducamstill uses the GPU), which takes much more time.
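A quick way to check this is to time the capture and the color conversion separately. The following is only a sketch; it assumes the same arducam_mipicamera calls and 1600x1300 mode as capture2opencv.py:

```python
# Sketch: time the capture and the CPU color conversion separately,
# to see how much of each frame period goes to cvtColor on the CPU.
import time
import cv2
import arducam_mipicamera as arducam

camera = arducam.mipi_camera()
camera.init_camera()
fmt = camera.set_resolution(1600, 1300)   # assumed OV2311 mode
width = (fmt[0] + 31) & ~31               # buffer width aligned up to 32
height = (fmt[1] + 15) & ~15              # buffer height aligned up to 16

t_cap = t_cvt = 0.0
for _ in range(200):
    t0 = time.time()
    frame = camera.capture(encoding='i420')
    t1 = time.time()
    image = frame.as_array.reshape(int(height * 1.5), width)
    image = cv2.cvtColor(image, cv2.COLOR_YUV2BGR_I420)
    t2 = time.time()
    t_cap += t1 - t0
    t_cvt += t2 - t1
print("capture: %.2f ms, convert: %.2f ms per frame"
      % (t_cap / 200 * 1000, t_cvt / 200 * 1000))
camera.close_camera()
```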

 

Using arducamstill the frame rate is indeed 60 fps. In Python, just using the mipi_camera.capture() method gives me 60 fps more than half of the time, but sometimes it gets stuck somewhere around 30-45 fps. This is without using the OpenCV lib, or any other I/O operation at all.

I also tried threading to separate capturing the frame from the OpenCV operations, but that doesn’t seem to improve the frame rate, and it also gives me a lot of segmentation faults. (This script is also in the GitHub repo I previously linked.)
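The threaded version I tried looks roughly like this (a simplified sketch, not my exact script; the intent is that only the capture thread touches the camera and the OpenCV work runs on a copy of the latest frame):

```python
# Sketch of the threaded approach: one thread only captures, the main
# thread runs the OpenCV processing on a copy of the most recent frame.
import threading
import time
import cv2
import arducam_mipicamera as arducam

camera = arducam.mipi_camera()
camera.init_camera()
fmt = camera.set_resolution(1600, 1300)   # assumed OV2311 mode
width = (fmt[0] + 31) & ~31               # buffer width aligned up to 32
height = (fmt[1] + 15) & ~15              # buffer height aligned up to 16

latest = None
lock = threading.Lock()
running = True

def grab():
    global latest
    while running:
        frame = camera.capture(encoding='i420')
        # copy out of the camera buffer before the frame object goes away
        data = frame.as_array.reshape(int(height * 1.5), width).copy()
        with lock:
            latest = data

t = threading.Thread(target=grab)
t.start()

n, t0 = 0, time.time()
while n < 300:
    with lock:
        data = latest
    if data is None:
        continue
    image = cv2.cvtColor(data, cv2.COLOR_YUV2BGR_I420)   # tracking / OpenCV work here
    n += 1
print("processing fps: %.1f" % (n / (time.time() - t0)))

running = False
t.join()
camera.close_camera()
```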

Could you help me figure out what might be causing the drops in frame rate when using the capture() method, and do you know a way to speed things up while using OpenCV?

Thanks for your help!

Hi,

For the capture() API, the image is fetched through the CPU and involves a memory copy, so the frame rate is reduced.

With arducamstill, all the processing is done by the GPU and there is no memory copy, so the speed you see is the sensor’s real speed.
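If your tracking only needs a grayscale image, you can also reduce the CPU work by skipping cv2.cvtColor: the OV2311 is a monochrome sensor, so the Y plane of the I420 buffer is already the image. This is only a sketch based on the capture2opencv.py buffer layout:

```python
# Sketch: for a monochrome sensor the Y plane of the I420 buffer is already
# the grayscale image, so cv2.cvtColor can be skipped to save CPU time.
import cv2
import arducam_mipicamera as arducam

camera = arducam.mipi_camera()
camera.init_camera()
fmt = camera.set_resolution(1600, 1300)   # assumed OV2311 mode
width = (fmt[0] + 31) & ~31               # buffer width aligned up to 32
height = (fmt[1] + 15) & ~15              # buffer height aligned up to 16

while cv2.waitKey(10) != 27:              # Esc to quit
    frame = camera.capture(encoding='i420')
    yuv = frame.as_array.reshape(int(height * 1.5), width)
    gray = yuv[:height, :fmt[0]]          # Y plane only, cropped to the real width
    cv2.imshow("OV2311 gray", gray)       # feed gray straight into the tracker instead
camera.close_camera()
```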

Adding isolcpus=3 to /boot/cmdline.txt and invoking with taskset -c 3 chrt 99 python live_feed.py also has no effect. I am assuming I2C speed has nothing to do with the frame rate.

Hi. I have also used the OV2311, and the 60 fps can’t be achieved in any real use case.
Since it uses the Jetvariety drivers, the highest fps with exposure 1 is <20 fps.

Then adding AI models on top cuts it down to 6-7 fps. Of course the further processing is not Arducam’s fault, but it is not possible to use Python and get 60 fps.

If the Arducam example script can’t get 60 fps, it is ridiculous to talk about 60 fps at full resolution. It is also worth mentioning that I do not use GStreamer but read via V4L2 directly.

The platform used is Jetson Xavier NX…
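For reference, my read loop is roughly the following (a sketch only: the device index, the resolution, and whether cv2.VideoCapture will negotiate the Jetvariety pixel format over V4L2 are assumptions here):

```python
# Sketch of a V4L2 read loop on the Jetson (no GStreamer). Device index,
# resolution and property handling are assumptions for the Jetvariety OV2311.
import time
import cv2

cap = cv2.VideoCapture(0, cv2.CAP_V4L2)          # /dev/video0 assumed
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1600)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 1300)
cap.set(cv2.CAP_PROP_CONVERT_RGB, 0)             # keep the raw mono frames
cap.set(cv2.CAP_PROP_EXPOSURE, 1)                # exposure 1, as mentioned above

n, t0 = 0, time.time()
while n < 300:
    ok, frame = cap.read()
    if not ok:
        break
    n += 1
print("fps: %.1f" % (n / (time.time() - t0)))
cap.release()
```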