Installing Arducam Jetvariety driver does not work on brand new Jetson Nano

Published by Margarito Lehmann on


  • This topic has 10 replies, 3 voices, and was last updated 1 month ago by wong.
Viewing 10 reply threads
  • Author
    Posts
    • #23050
      Margarito Lehmann
      Participant

      From Eirikur:
      Error states:
      [email protected]:~/camera/MIPI_Camera/Jetson/Jetvariety/driver$ sudo apt install ./arducam-nvidia-l4t-kernel_4.9.140-32.3.1-20200331173033_arm64.deb
      …
      arducam-nvidia-l4t-kernel : PreDepends: nvidia-l4t-ccp-t210ref but it is not installable

      Arducam Support Reply:
      The deb installation package of the MIPI driver on Jetson Nano supports L4T 32.3.1, which corresponds to JetPack 4.3.
      Our engineer has just updated it to support L4T 32.4.2 in commit 462c762.
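      As a quick sanity check before picking a .deb, the installed L4T release can be read from /etc/nv_tegra_release. A minimal sketch (the parsing regex is mine; the first-line format shown in the docstring is the standard one on Jetson images):

      ```python
      import re

      def parse_l4t_release(text):
          """Extract the L4T version (e.g. '32.3.1') from the first line of
          /etc/nv_tegra_release, which looks like:
          # R32 (release), REVISION: 3.1, GCID: ..., BOARD: t210ref, ...
          """
          m = re.search(r"R(\d+)\s*\(release\),\s*REVISION:\s*(\d+(?:\.\d+)*)", text)
          return f"{m.group(1)}.{m.group(2)}" if m else None

      # Against a sample line (on a real Nano, read /etc/nv_tegra_release instead):
      sample = "# R32 (release), REVISION: 4.2, GCID: 20074772, BOARD: t210ref"
      print(parse_l4t_release(sample))  # -> 32.4.2
      ```

      The driver .deb has to match this value, which is why the 32.3.1 package fails its PreDepends check on a newer image.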

      From Eirikur:
      Thanks for the quick response! I had a feeling this was the case so I downgraded my nano with the older JetPack and got both UC-599 cameras to work. Obviously I want to use the latest JetPack so I will test the new commit later today and get back to you.

      On a related note. Are there any more examples on programmatically controlling the cameras? I’m planning to use the external trigger function to slightly offset capturing on the two cameras and combine their frames for high speed object analysis. I’m wondering if the same thing could be achieved in code without the physical triggering?

      Also, any good examples on streaming the cameras with gstreamer or similar realtime streaming would be appreciated. The nano might not be powerful enough to train the deep learning model I’m doing so my backup plan is to stream the frames to a Jetson TX2 or Xavier for training and inference.

      Arducam Support Reply:
      1. At present, the external trigger function only works with physical triggering.
      2. Sorry to say there are no GStreamer examples yet; we will keep you posted if there is any good news.
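      Even with a physical-only trigger, the "slight offset" between the two cameras can still live in software: compute two delayed trigger schedules and emit them as GPIO pulses. A sketch under assumptions (the `gpio` object, pulse helper, and any pin wiring are illustrative, not from Arducam docs):

      ```python
      import time

      def trigger_schedule(fps, offset_s, n_frames):
          """Trigger timestamps (seconds from start) for two cameras running
          at `fps`, with camera B delayed behind camera A by `offset_s`."""
          period = 1.0 / fps
          cam_a = [i * period for i in range(n_frames)]
          cam_b = [t + offset_s for t in cam_a]
          return cam_a, cam_b

      def pulse(gpio, pin, width_s=0.001):
          """Emit one trigger pulse on `pin` (hypothetical wiring; on a Nano
          this would use something like the Jetson.GPIO library)."""
          gpio.output(pin, gpio.HIGH)
          time.sleep(width_s)
          gpio.output(pin, gpio.LOW)

      # 100 fps, camera B trailing by 5 ms:
      a, b = trigger_schedule(100, 0.005, 3)
      print(a, b)
      ```

      Frames from the two offset streams could then be interleaved by timestamp for the high-speed analysis described above.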

      From Eirikur:
      Thanks for that update, but where is the documentation for the API/features of the monochrome Jetson Nano camera boards like the UC-599? There is just one example of using it on GitHub, and that covers only the external trigger function…
      e.g. how can I configure it for the maximum fps, do single-frame captures in code, find out which pixel formats are supported, etc.? Am I missing some general Arducam examples section that works for all cameras?

      Looking for something like this or better…

      Camera Userland Driver SDK and Examples
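      In the absence of per-camera docs, the generic V4L2 route answers most of these questions. A sketch of the format/fps plumbing, assuming OpenCV's V4L2 backend; the "GREY" code and device index are guesses, and `v4l2-ctl --list-formats-ext -d /dev/video0` shows what the driver actually advertises:

      ```python
      def fourcc(code):
          """Pack a 4-character pixel-format code (e.g. 'GREY') into the
          32-bit integer that V4L2 and OpenCV's CAP_PROP_FOURCC expect."""
          a, b, c, d = (ord(ch) for ch in code)
          return a | (b << 8) | (c << 16) | (d << 24)

      def fourcc_to_str(value):
          """Unpack the integer back into its 4-character code."""
          return "".join(chr((int(value) >> (8 * i)) & 0xFF) for i in range(4))

      print(fourcc_to_str(fourcc("GREY")))  # -> GREY

      # On the Nano itself, roughly (cv2 with the V4L2 backend):
      #   import cv2
      #   cap = cv2.VideoCapture(0, cv2.CAP_V4L2)
      #   cap.set(cv2.CAP_PROP_FOURCC, fourcc("GREY"))  # pixel format (guess)
      #   cap.set(cv2.CAP_PROP_FPS, 120)                # driver clamps to its max
      #   ok, frame = cap.read()                        # single-frame capture
      ```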


    • #23095
      eikish
      Participant

      I can confirm now that the new driver on GitHub works on JetPack 4.4.

      However I notice a strange effect now when recording a stream in VLC. The video recording has double width (half of it is black). This didn’t happen before with JetPack 4.3 and the older driver. See attached screenshot. Note that I have 2 cameras connected but was only showing one when I used the record function in VLC.

      Also…I mentioned that I needed more coding examples. These high speed cameras are great but USELESS if one cannot use them for training models or inferring like other cameras on the Jetson platform. I have a Nano, TX2 and Xavier if you guys need a good alpha/beta tester. A good priority would be to get Nvidia’s official “Two days to a demo” working on Jetson Nano with your Jetvariety cameras. This is what most developers will try first (works out of the box on TX2 with the built in camera).

      See here: https://developer.nvidia.com/embedded/twodaystoademo
      and more specifically here is the code to compile with lots of examples to run on various pre-trained neural networks:
      https://github.com/dusty-nv/jetson-inference

      The output of the examples seems to recognize and initialize the cameras ok but the whole thing freezes when it tries to display the camera feed live.

      Here’s how you get up and running with it:

      git clone --recursive https://github.com/dusty-nv/jetson-inference
      cd jetson-inference/
      mkdir build
      cd build/
      cmake ../
      # go through wizards
      make
      sudo make install
      sudo ldconfig
      # probably reboot just in case…
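      Before blaming jetson-inference for the freeze, it can also help to preview the camera with a bare GStreamer pipeline first. A small helper that builds a gst-launch-1.0 string (the caps values are placeholders, not confirmed Jetvariety numbers; substitute whatever `v4l2-ctl --list-formats-ext` reports):

      ```python
      def v4l2_preview_pipeline(device="/dev/video0", fmt="GRAY8",
                                width=1280, height=800, fps=60):
          """Build a gst-launch-1.0 pipeline string for previewing a V4L2
          camera. All caps values are placeholders for what the driver
          actually advertises."""
          caps = (f"video/x-raw,format={fmt},width={width},"
                  f"height={height},framerate={fps}/1")
          return f"v4l2src device={device} ! {caps} ! videoconvert ! autovideosink"

      print(v4l2_preview_pipeline())
      ```

      If this pipeline previews cleanly but jetson-inference still hangs, the problem is in the capture glue rather than the driver.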

      Thanks!


    • #23128
      wong
      Moderator

      Is the VLC playback normal? Does the problem occur only when recording?

    • #23156
      eikish
      Participant

      @wong Yes the playback is normal. The extra width is only in the recorded file.

    • #23169
      wong
      Moderator

      Hi @eikish ,

      Then I don’t think it’s a problem with the camera itself; it may be a VLC issue. Can you tell me how you recorded it with VLC? I want to try to reproduce the problem.

      We will continue to follow up on “Two days to a demo” and “jetson-inference”, but it will take some time.

    • #23228
      eikish
      Participant

      thanks @wong

      See my issue on GitHub for more information on the jetson-inference (GStreamer) problem:

      https://github.com/ArduCAM/MIPI_Camera/issues/44

      It’s like it almost works. Inference is the reason why people buy Jetson Nano.

      On the VLC capture strangeness… I actually cannot reproduce it now. I may have updated some Linux packages, so the environment is not 100% the same. I’ll post again if it ever happens.

    • #23241
      eikish
      Participant

      @wong I added a bunch of “working” examples with gst-launch-1.0 on the GitHub issue, please check it out. Would be great if you guys can look into what changes are needed on your side and/or Jetson-inference code to have your camera work out of the box with the downloaded neural networks on Jetson Nano.

      https://github.com/ArduCAM/MIPI_Camera/issues/44

    • #23315
      wong
      Moderator

      Hi @eikish ,

      Thank you for the information. I will check it out, but it will take some time; please be patient.
