
Pico4ML Magic Wand Example Update with BLE Enabled

Table Of Contents

1. Collecting Data for Your Custom Magic Wand Project
2. Training and Deploying Your Custom Magic Wand Project

In this reading, we are going to collect custom gestures which we can then later use to train a custom magic wand project.

Step 1

Open up your browser and navigate to: https://www.oakchina.cn/pico/

a. If you see a warning that says:

Error: This browser doesn’t support Web Bluetooth. Try using Chrome.

If you are not using Chrome, we suggest you switch to Chrome.

If you are already using Chrome, navigate to the following URL and make sure to enable the experimental web platform features flag. We have found this step to be necessary on older versions of Chrome and for users running the Linux operating system.

chrome://flags/#enable-experimental-web-platform-features


If this problem persists, make sure you are navigating to the https:// version of the URL. Web Bluetooth only works over the secure protocol, so the page will not function over plain http://.

b. You should then arrive at a webpage that looks something like the following.


Step 2

You’ll then need to connect your device over Bluetooth. To do that, simply click the blue Bluetooth button and a pop-up will appear asking to pair. Select your Pico4ML, which should be called something like BT16, and click the pair button. Do note that the course staff has found that you sometimes have to attempt this step twice. Once you are connected, the Bluetooth button will turn green.

Step 3

Once your Pico is paired it’s time to record some gestures. You’ll notice that every time you move the Pico around and then stop a new gesture is recorded. This is because the gestures are automatically split up by times when the wand is kept still. These pauses act like spaces between words, and so when you’ve finished a gesture you should stop moving the wand so that it ends cleanly. Note that the direction of the gesture matters (e.g., a clockwise circle is different from a counterclockwise circle)!
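The splitting behavior can be pictured with a small sketch. The detector below is purely illustrative (the struct name, threshold, and sample count are our assumptions, not the page's actual implementation): a gesture is closed out once the accelerometer reads roughly 1 g (gravity only, i.e., the wand is held still) for several consecutive samples.

```cpp
#include <cmath>

// Illustrative stillness detector: a gesture "ends" once the
// acceleration magnitude stays near 1 g (gravity only) for a
// number of consecutive samples. Thresholds are placeholders.
struct StillnessDetector {
  float tolerance_g;     // how close to 1 g counts as "still"
  int required_samples;  // consecutive still samples to end a gesture
  int still_count = 0;

  // Feed one accelerometer sample (in g); returns true once the
  // device has been still long enough to close out a gesture.
  bool update(float ax, float ay, float az) {
    float magnitude = std::sqrt(ax * ax + ay * ay + az * az);
    if (std::fabs(magnitude - 1.0f) < tolerance_g) {
      ++still_count;
    } else {
      still_count = 0;  // any movement resets the pause
    }
    return still_count >= required_samples;
  }
};
```

This is why pausing cleanly between gestures matters: movement resets the pause counter, so a shaky stop keeps the recording open.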

These gestures you are drawing will start to show up in the list on the right side of the screen. You can look at the shapes shown there to understand whether the gestures came out cleanly. A good rule of thumb is that if you can’t tell what the gesture is by looking at it, then a model will have a hard time recognizing it too. If you want to delete a recording simply click the trashcan icon at the top right of each gesture recording. (You may need to delete a lot of spurious recordings that were made as you moved the Pico into position between each gesture).

Also, make sure to label all of your gestures for training. You can label each recording by clicking on the question mark at the top left of each gesture and typing in your label. For example, the screenshot below shows the label Z added to the gesture of the letter Z recorded by the course staff.

The staff has found that collecting ~20 examples each of 2-3 different gestures will often be enough data to successfully train a moderately decent magic wand application (i.e., it will often work for you but may not generalize to other users). To help you keep track of how many gestures you have recorded, there is a number in the top right of the screen (e.g., the number 1 as shown in the image below). The staff has also found that gestures like a circle (O) or the letter Z (for Zorro) tend to work quite well! Finally, you can upload multiple JSON files to the training script, so don’t feel pressured to do all of your gesture recordings in one shot!


Step 4

When you are done collecting all of your data, simply click the blue “Download Data” button and a JSON file with all of the gestures will be automatically downloaded! We’ll use that file in the Colab in the next section to train a custom model! Be careful: if you leave or refresh the web page, your recorded gestures will be lost, so make sure you use the “Download Data” button to save them!

2. Training and Deploying Your Custom Magic Wand Project

In this document, we are going to train and then deploy a custom magic wand model based on the custom gestures we just collected.

2.1. Training the Magic Wand Model

The first thing you’ll need to do is to upload your gesture dataset into Colab and train a new magic wand model. Then in that Colab, we’ll need to convert that model first into a quantized .tflite file and then into a .cc file for use with the Arduino IDE.

We will be using the resulting .cc file, so make sure to download it or leave the tab open with the printout!
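For reference, a model converted this way is typically dumped with a tool like xxd -i, so the .cc file's contents look roughly like the following (the array name and bytes here are placeholders, not a real model):

```cpp
// Sketch of a typical xxd-style model dump (placeholder bytes, not a real model).
unsigned char magic_wand_tflite[] = {
  0x1c, 0x00, 0x00, 0x00, 0x54, 0x46, 0x4c, 0x33,
  // ... thousands more bytes ...
};
unsigned int magic_wand_tflite_len = 8;
```

Note the `unsigned int` length variable at the bottom; that type difference comes up again when deploying the model below.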

https://colab.research.google.com/github/tinyMLx/colabs/blob/master/4-8-11-CustomMagicWand.ipynb

2.2. Deploying the Trained Model

Step 1

Use a USB cable to connect the Arducam Pico4ML to your machine.

Step 2

Download the latest version of Pico4ML_TensorFlowLite from https://github.com/Arducam-team/arduino-library/releases and import it via Sketch → Include Library → Add .ZIP library…

Step 3

Open the magic_wand_lab.ino sketch, which you can find via the File drop-down menu. Navigate as follows: File → Examples → Pico4ML_TensorFlowLite → magic_wand_lab.

Step 4

Navigate to the magic_wand_model_data.cpp file and update the model.

  • Copy the binary model data from the magic_wand.cc file into the magic_wand_model_data.cpp file. As always, make sure to copy only the binary data inside the { }, as the variable type is different in the downloaded or printed magic_wand.cc file.
  • Next, scroll all the way down to the bottom of the file and replace the model length. Again, note that the .cpp file needs the variable to be of type const int while the .cc file will show unsigned int. Our suggestion again is to simply copy the numerical value.
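Putting the two bullets together, the edited magic_wand_model_data.cpp ends up shaped roughly like this (the variable names follow the TensorFlow Lite Micro convention and the bytes are placeholders; check your own file for the exact names):

```cpp
// Sketch of the receiving side in magic_wand_model_data.cpp
// (names follow the TensorFlow Lite Micro convention; placeholder bytes).
alignas(8) const unsigned char g_magic_wand_model_data[] = {
  // paste ONLY the bytes from inside the { } of magic_wand.cc here
  0x1c, 0x00, 0x00, 0x00, 0x54, 0x46, 0x4c, 0x33,
};
// The .cc file declares this length as `unsigned int`; copy just the number.
const int g_magic_wand_model_data_len = 8;
```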

Step 5

You’ll then need to make two changes to the magic_wand_lab.ino file to tell it the number of gestures and the gesture labels in your dataset. These changes occur on lines 48-52, which currently read as:

// -------------------------------------------------------------------------------- //
// UPDATE THESE VARIABLES TO MATCH THE NUMBER AND LIST OF GESTURES IN YOUR DATASET  //
// -------------------------------------------------------------------------------- //
constexpr int label_count = 10;
const char* labels[label_count] = {"0", "1", "2", "3", "4", "5", "6", "7", "8", "9"};

  • Update the label_count to reflect the number of gestures in your dataset.
  • Update the list of labels to reflect the gestures in your dataset. Note that the order matters! Make sure it matches the alphanumeric order pointed out in the training script!
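As a concrete example, a hypothetical dataset with the three gestures O, W, and Z would be configured as follows (note the alphanumeric ordering of the labels):

```cpp
// ------------------------------------------------------------------ //
// Example for a hypothetical three-gesture dataset (O, W, Z).
// The labels must appear in the same alphanumeric order used by
// the training script.
// ------------------------------------------------------------------ //
constexpr int label_count = 3;
const char* labels[label_count] = {"O", "W", "Z"};
```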

Step 6

When you save, you will be asked to save a copy. We suggest making a folder called, e.g., TinyML inside of your Arduino folder (you can find your main Arduino folder either inside your Documents folder or in your Home folder) and saving the sketch there with a descriptive name like magic_wand_custom. That said, you can save it wherever you like with whatever name you want!

Step 7

As always, use the Tools drop-down menu to select the appropriate Port and Board.

  • Select the Arducam Pico4ML as the board by going to Tools → Board: → Raspberry Pi RP2040 Boards → Raspberry Pi Pico. Note that on different operating systems the exact name of the board may vary, but it should include the word Pico at a minimum.
  • Then select the USB Port associated with your board. This will appear differently on Windows, macOS, and Linux, but will likely indicate “Arducam Pico4ML” in parentheses. You can select it by going to Tools → Port: <Current Port (Board on Port)> → <TBD Based on OS> (Arducam Pico4ML), where <TBD Based on OS> is most likely to come from the list below and <#> indicates some integer number:
    • Windows → COM<#>
    • macOS → /dev/cu.usbmodem<#>
    • Linux → ttyUSB<#> or ttyACM<#>

Step 8

Use the rightward arrow to upload/flash the code. Do not be alarmed if you see a series of orange warnings appear in the console. This is expected as we are working with bleeding-edge code. You’ll know the upload is complete when you read the text in the console at the bottom of the IDE that shows 100% upload of the code and a statement that says something like “Done in <#.#> seconds.”

If you receive an error you will see an orange error bar appear and a red error message in the console. Don’t worry – there are many common reasons this may have occurred.

Step 9

Now open the serial monitor and test out your custom model. As a reminder, the serial monitor will first output ASCII art showing the gesture you just performed, and below it the best-match label as well as a confidence score between 0 and 100. The confidence score indicates how strongly the model believes you performed the gesture. Do note that every time you move the board and then stop, a new gesture will be processed, so don’t be surprised to get some odd results as you move the board to prepare for a gesture.
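Conceptually, the best match is simply the label with the highest model score, scaled to a 0-100 confidence. A minimal sketch of that selection (the function name and the 0-1 score range are our assumptions, not the sketch's actual code):

```cpp
// Illustrative argmax over per-label scores, scaled to the 0-100
// confidence shown in the serial monitor printout. Assumes the model
// outputs one score per label in the 0-1 range.
int BestGesture(const float* scores, int label_count, int* confidence) {
  int best = 0;
  for (int i = 1; i < label_count; ++i) {
    if (scores[i] > scores[best]) best = i;
  }
  *confidence = static_cast<int>(scores[best] * 100.0f);
  return best;  // index into the labels[] array
}
```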

................................
.........####...................
.......##....##.................
......#........#................
......#........#................
......#.........#...............
.....#...........#..............
.....#............#.............
.....#.............#............
....#...........................
....#..............#............
....#...............#...........
....#...............#...........
...#................#...........
...#.................#..........
...#..................##........
...#.................###........
...#.................##.........
...#................##..........
...##...............#...........
....#..............##...........
....#...........####............
.....#.........#.##.............
......###...###.................
........####....................
................................
................................
................................
................................
................................
................................
................................
Found 0 (67%)
