Dear colleagues,
I have installed the ov9281 and worked through the quick setup guide. Instead of using the dtoverlay, I installed the MIPI_Camera/RPI repository from GitHub. The camera is found, and the result of the command
v4l2-ctl --stream-mmap --stream-count=-1 -d /dev/video0 --stream-to=/dev/null
appears as in the screenshot in the guide.
Everything in the quick start guide works up until “First Use”. I cannot get video, or any image, to appear on the client computer (running Ubuntu 18.04).
I am trying to stream video from my headless Raspberry Pi 4B to an Ubuntu computer. Both have the latest GStreamer installed. The two computers communicate wirelessly over a local network router, using ssh. While setting this up, I have disconnected the router from the internet and disabled the firewalls on both computers, so port access issues shouldn’t be a problem.
I have tried both versions of the code in the README.md file in the /RPI folder.
Example 1 process and results:
First the Pi to start the stream:
`pi://~MIPI_Camera/RPI $ ./video2stdout | nc -l -p 5000
Open camera…
Found sensor ov9281 at address 60
init camera status = 4100`
Then the Ubuntu machine to play the stream:
`ubuntu:~$ gst-launch-1.0 -v tcpclientsrc host=192.168.1.40 port=5000 ! decodebin ! autovideosink sync=false
Setting pipeline to PAUSED …
Pipeline is PREROLLING …`
And that’s it: no PLAYING and no video. It just hangs in that state.
I know they are connected, because when I Ctrl+C on the Pi side to stop the camera stream, the Ubuntu side also terminates, with the following output:
`ERROR: from element /GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind: Stream doesn’t contain enough data.
Additional debug info:
gsttypefindelement.c(996): gst_type_find_element_chain_do_typefinding (): /GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind:
Can’t typefind stream
ERROR: pipeline doesn’t want to preroll
Setting pipeline to NULL …
Freeing pipeline …`
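For what it’s worth, my current theory on Example 1: video2stdout emits a bare H.264 elementary stream with no container, so typefind has nothing to latch onto on the raw TCP socket. A sketch of a workaround (assuming the camera really does produce byte-stream H.264) is to skip typefinding and force the caps on the client:

```shell
# Force the caps instead of letting decodebin typefind the raw socket data.
gst-launch-1.0 -v tcpclientsrc host=192.168.1.40 port=5000 \
    ! 'video/x-h264,stream-format=byte-stream' \
    ! h264parse ! avdec_h264 ! autovideosink sync=false
```

If this plays, it confirms the problem is typefinding rather than the network.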
Trying Example 2:
initiating on the Pi side:
`pi://~MIPI_Camera/RPI $ ./video2stdout | gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! gdppay ! tcpserversink host=192.168.1.40 port=5000
Open camera …
Setting pipeline to PAUSED …
Pipeline is PREROLLING …
/GstPipeline:pipeline0/GstTCPServerSink:tcpserversink0: current-port = 5000
Found sensor ov9281 at address 60`
The Pi pauses for about one second, then continues:
`Pipeline is PREROLLED …
Setting pipeline to PLAYING …
New clock: GstSystemClock
Got EOS from element “pipeline0”
Execution ended after 0:00:00.001462016
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …
pi://~MIPI_Camera/RPI $`
Doing my best “fastest draw in the West” act, I tried to start the client side script as quickly as possible. Here’s what that looks like:
`user $ gst-launch-1.0 -v tcpclientsrc host=192.168.1.40 port=5000 ! gdpdepay ! rtph264depay ! avdec_h264 ! autovideosink sync=false
Setting pipeline to PAUSED …
Pipeline is PREROLLING …
Pipeline is PREROLLED …
Setting pipeline to PLAYING …
New clock: GstSystemClock
Got EOS from element “pipeline0”
Execution ended after 0:00:00.000599961
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …
user $`
With Example 2, it seems to have connected and played the stream for a trivial fraction of a second before hitting EOS. Any ideas what is going on here? This is proving really difficult to debug.
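One more isolation step I may try, as a sketch (fakesink avoids needing a display on the headless Pi): cut the network out entirely and decode the stream on the Pi itself. If this runs without EOS-ing immediately, the capture and encode side is fine and the fault is in the transport:

```shell
# All-local test: feed video2stdout straight into a decoder, no TCP.
./video2stdout | gst-launch-1.0 -v fdsrc fd=0 \
    ! h264parse ! avdec_h264 ! fakesink sync=false
```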
Thank you