
Arducam Pico4ML TinyML Dev Kit

Introduction

Pico4ML is a microcontroller board based on RP2040 for on-device machine learning. It also packs a camera, microphone, IMU, and display to help you get started with TensorFlow Lite Micro, which has been ported to the RP2040.
We’ve included three pre-trained TensorFlow Lite Micro examples: Person Detection, Magic Wand, and Wake-Word Detection. You can also build, train, and deploy your own models on it.

Specs

Microcontroller: Raspberry Pi RP2040
IMU: ICM-20948
Camera Module: HiMax HM01B0, up to QVGA (320 x 240 @ 60fps)
Screen: 0.96 inch SPI LCD display (160 x 80, ST7735)
Operating Voltage: 3.3V
Input Voltage: VBUS 5V ±10%; VSYS max 5.5V
Dimensions: 51 x 21 mm

Quick Start

We’ve provided some pre-built binaries that you can just drag and drop onto your Pico4ML to make sure everything is working even before you start writing your code.

Pre-trained Models

Wake-word detection

A demo where Pico4ML provides always-on wake-word detection, using its onboard microphone and a pre-trained speech detection model to recognize whether someone is saying “yes” or “no”.

Magic Wand (Gesture Detection)

A demo where Pico4ML casts a “spell” for each of three gestures: “Wing”, “Ring”, and “Slope”, recognized using its IMU and a pre-trained gesture detection model.


Person Detection

A demo where Pico4ML predicts the probability that a person is present, using its HiMax HM01B0 camera module.

First Use

Wake-word Detection

  1. Click the link https://raw.githubusercontent.com/ArduCAM/pico-tflmicro/main/bin/micro_speech.uf2 to download the “micro_speech.uf2” file.
  2. Plug one end of a micro USB cable into your Raspberry Pi or laptop, then press and hold the BOOTSEL button on your Pico4ML while you plug the other end into the board.
  3. Release the button after the board is plugged in. A disk volume called RPI-RP2 should pop up on your desktop.
  4. Double-click to open it, then drag and drop the UF2 file into it (or copy it from the command line, as sketched after these steps). The volume will automatically unmount and the screen should light up.
  5. Hold your Pico4ML close and say “yes” or “no”. The screen will display the corresponding word.
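
If you prefer the command line on a Raspberry Pi, the same flashing steps look roughly like this. This is a minimal sketch: it assumes the RPI-RP2 volume auto-mounts at /media/$USER/RPI-RP2, which depends on your desktop environment; otherwise mount it manually as shown in the sections further below.

# Download the pre-built wake-word binary
wget https://raw.githubusercontent.com/ArduCAM/pico-tflmicro/main/bin/micro_speech.uf2

# With the board plugged in while holding BOOTSEL, copy the UF2 to the
# RPI-RP2 volume (the path assumes a desktop auto-mount) and flush the write
cp micro_speech.uf2 /media/$USER/RPI-RP2/
sync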

Magic Wand (Gesture Detection)

  1. Click the link https://raw.githubusercontent.com/ArduCAM/pico-tflmicro/main/bin/magic_wand.uf2 to download the “magic_wand.uf2” file.
  2. Repeat the 2nd-5th steps mentioned in “Wake-word Detection” to light up the screen.
  3. Wave your Pico4ML quickly in a W (wing), O (ring), or L (slope) shape. The screen will display the corresponding mark.

Person Detection

  1. Click the link https://raw.githubusercontent.com/ArduCAM/pico-tflmicro/main/bin/person_detection_int8.uf2 to download the “person_detection_int8.uf2” file.
  2. Repeat the 2nd-5th steps mentioned in “Wake-word Detection” to light up the screen.
  3. Hold up your Pico4ML to capture images. The screen will display the image and the probability that a person is present.

Quick Pico Setup

If you are developing for Raspberry Pi Pico on the Raspberry Pi 4B, or the Raspberry Pi 400, most of the installation steps in this Getting Started guide can be skipped by running the setup script. You can get this script by doing the following:

git clone https://github.com/raspberrypi/pico-setup.git

Then run:

 pico-setup/pico_setup.sh

The script will:

  • Create a directory called pico
  • Install required dependencies
  • Download the pico-sdk, pico-examples, pico-extras, and pico-playground repositories
  • Define PICO_SDK_PATH, PICO_EXAMPLES_PATH, PICO_EXTRAS_PATH, and PICO_PLAYGROUND_PATH in your ~/.bashrc
  • Build the blink and hello_world examples in pico-examples/build/blink and pico-examples/build/hello_world
  • Download and build picotool (see Appendix B), and copy it to /usr/local/bin
  • Download and build picoprobe (see Appendix A)
  • Download and compile OpenOCD (for debug support)
  • Download and install Visual Studio Code
  • Install the required Visual Studio Code extensions (see Chapter 6 for more details)
  • Configure the Raspberry Pi UART for use with Raspberry Pi Pico

Once it has run, you will need to reboot your Raspberry Pi:

sudo reboot
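
After the reboot, you can sanity-check what the script set up. This is only a quick sketch; it assumes you ran pico_setup.sh from your home directory, so the pico directory it creates sits at ~/pico.

echo $PICO_SDK_PATH                               # should point at the pico-sdk checkout
ls ~/pico                                         # pico-sdk, pico-examples, pico-extras, pico-playground
ls ~/pico/pico-examples/build/blink/blink.uf2     # one of the examples built by the script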

Person Detection

  • Download pico-tflmicro
git clone https://github.com/ArduCAM/pico-tflmicro.git
  • Compile
cd pico-tflmicro
mkdir build 
cd build 
cmake ..
make
# or only make person_detection_int8

Then the build will create some files under the pico-tflmicro/build/examples/person_detection directory.
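
To confirm that the build produced the binaries, you can list the output directory from inside build. A minimal check, assuming the default layout described above:

ls examples/person_detection/
# expected among the outputs: person_detection_int8.uf2 and person_detection_benchmark.uf2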

  • person_detection_int8.uf2: the main program of person_detection, which can be dragged onto the RP2040 USB Mass Storage Device.
  • person_detection_benchmark.uf2: the benchmark program of person_detection; you can use it to test the performance of person_detection on the Pico.

Tip: If you don’t want to compile, you can use the pre-built UF2 file linked above; you only need to wire the hardware and copy the UF2 to the device.

Test Person Detection

  • person_detection_int8: a person detection demo.
  • Hardware requirements (see the wiring image in the article linked below)

Learn more here: pico4ml-an-rp2040-based-platform-for-tiny-machine-learning

  • Load and run person_detection

The simplest method to load software onto an RP2040-based board is by mounting it as a USB Mass Storage Device. Doing this allows you to drag a file onto the board to program the flash. Go ahead and connect the Pico4ML to your Raspberry Pi using a micro USB cable, making sure that you hold down the BOOTSEL button to force it into USB Mass Storage Mode.

If you are logged in via SSH, for example, you may have to mount the mass storage device manually:

$ dmesg | tail
[ 371.973555] sd 0:0:0:0: [sda] Attached SCSI removable disk
$ sudo mkdir -p /mnt/pico
$ sudo mount /dev/sda1 /mnt/pico

If you can see files in /mnt/pico then the USB Mass Storage Device has been mounted correctly:

$ ls /mnt/pico/
INDEX.HTM INFO_UF2.TXT

Copy your person_detection_int8.uf2 onto the RP2040:

sudo cp examples/person_detection/person_detection_int8.uf2 /mnt/pico
sudo sync

View Output

The person detection example outputs some information over USB; you can use minicom to view it:

minicom -b 115200 -o -D /dev/ttyACM0

The person detection example also outputs the image data and detection results to the UART, and you can see them directly on the board’s screen.
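
If minicom is not installed, it is available from the standard package repositories. To read the UART output on a Raspberry Pi, the Pico4ML’s UART pins must additionally be wired to the Pi’s GPIO header; the /dev/serial0 device name below is the Pi’s primary UART and is an assumption that depends on your wiring and configuration.

sudo apt install minicom                  # install minicom if it is missing
ls /dev/ttyACM*                           # find the USB serial device exposed by the board
minicom -b 115200 -o -D /dev/serial0      # read the UART output instead of the USB output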

Micro Speech

  • Download pico-tflmicro
git clone https://github.com/ArduCAM/pico-tflmicro.git
  • Compile
cd pico-tflmicro
mkdir build 
cd build 
cmake ..
make
# or only make micro_speech

Then the build will create some files under the pico-tflmicro/build/examples/micro_speech directory.

  • micro_speech.uf2: the main program of micro_speech, which can be dragged onto the RP2040 USB Mass Storage Device.

Tip: If you don’t want to compile, you can use the pre-built UF2 file linked above; you only need to wire the hardware and copy the UF2 to the device.

Test Micro Speech

  • micro_speech: a micro speech demo.
  • Hardware requirements (see the wiring image in the article linked below)

Learn more here: pico4ml-an-rp2040-based-platform-for-tiny-machine-learning

  • Load and run micro_speech

The simplest method to load software onto an RP2040-based board is by mounting it as a USB Mass Storage Device. Doing this allows you to drag a file onto the board to program the flash. Go ahead and connect the Pico4ML to your Raspberry Pi using a micro USB cable, making sure that you hold down the BOOTSEL button to force it into USB Mass Storage Mode.

If you are logged in via SSH, for example, you may have to mount the mass storage device manually:

$ dmesg | tail
[ 371.973555] sd 0:0:0:0: [sda] Attached SCSI removable disk
$ sudo mkdir -p /mnt/pico
$ sudo mount /dev/sda1 /mnt/pico

If you can see files in /mnt/pico then the USB Mass Storage Device has been mounted correctly:

$ ls /mnt/pico/
INDEX.HTM INFO_UF2.TXT

Copy your micro_speech.uf2 onto the RP2040:

sudo cp examples/micro_speech/micro_speech.uf2 /mnt/pico
sudo sync

View Output

The micro speech example outputs some information over USB; you can use minicom to view it:

minicom -b 115200 -o -D /dev/ttyACM0

The micro speech example also outputs the results to the screen.

Magic Wand

  • Download pico-tflmicro
git clone https://github.com/ArduCAM/pico-tflmicro.git
  • Compile
cd pico-tflmicro
mkdir build 
cd build 
cmake ..
make
# or only make magic_wand

Then the build will create some files under the pico-tflmicro/build/examples/magic_wand directory.

  • magic_wand.uf2: the main program of magic_wand, which can be dragged onto the RP2040 USB Mass Storage Device.

Tip: If you don’t want to compile, you can use the pre-built UF2 file linked above; you only need to wire the hardware and copy the UF2 to the device.

Test Magic Wand

  • magic_wand: a magic wand demo.
  • Hardware requirements (see the wiring image in the article linked below)

Learn more here: pico4ml-an-rp2040-based-platform-for-tiny-machine-learning

  • Load and run magic_wand

The simplest method to load software onto an RP2040-based board is by mounting it as a USB Mass Storage Device. Doing this allows you to drag a file onto the board to program the flash. Go ahead and connect the Pico4ML to your Raspberry Pi using a micro USB cable, making sure that you hold down the BOOTSEL button to force it into USB Mass Storage Mode.

If you are logged in via SSH, for example, you may have to mount the mass storage device manually:

$ dmesg | tail
[ 371.973555] sd 0:0:0:0: [sda] Attached SCSI removable disk
$ sudo mkdir -p /mnt/pico
$ sudo mount /dev/sda1 /mnt/pico

If you can see files in /mnt/pico then the USB Mass Storage Device has been mounted correctly:

$ ls /mnt/pico/
INDEX.HTM INFO_UF2.TXT

Copy your magic_wand.uf2 onto the RP2040:

sudo cp examples/magic_wand/magic_wand.uf2 /mnt/pico
sudo sync

View Output

The magic wand example outputs some information over USB; you can use minicom to view it:

minicom -b 115200 -o -D /dev/ttyACM0

The magic wand example also outputs the results to the screen.
