Wio Terminal Chirping birds detection using machine learning: Audio classification

This tutorial describes how to build a machine learning model to detect chirping birds using the Wio Terminal and Edge Impulse. We have already covered how to use machine learning to classify animal sounds with the Arduino RP2040; in this tutorial, we apply the knowledge acquired in that post to the Wio Terminal. To simplify the steps needed to build the machine learning model, we will use Edge Impulse together with a Colab script. Some of the steps described here were already covered in the previous post, so we will not repeat them. As we did previously, we will use the Environmental Sound Classification 50 (ESC-50) dataset. This is a classification problem where we have to analyze a sound and determine whether it belongs to the chirping birds class or to the noise class.

This project is different from others because the Edge Impulse library does not support the Wio Terminal natively, so it is necessary to modify the generated code before using it.

How to build the machine learning model using Edge Impulse

The first step is creating a new project in Edge Impulse, where we will upload our dataset. Once that is done, go to the GitHub repository and open the script in Colab. The script accomplishes the following tasks (a sketch of the data-preparation steps follows the list):

  1. Download the Environmental Sound Classification 50 dataset
  2. Unzip it
  3. Import the pandas library to manipulate the CSV file that describes the dataset
  4. Download the noise dataset
  5. Upload it to Edge Impulse
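
For reference, the data-preparation part of the script looks roughly like the sketch below. This is a minimal illustration, not the exact Colab code: the Kaggle dataset slug, the archive name, and the folder layout after unzipping are assumptions you may need to adjust.

import os
import zipfile

import pandas as pd

# Kaggle reads its credentials from environment variables.
os.environ["KAGGLE_USERNAME"] = "your-kaggle-username"  # placeholder
os.environ["KAGGLE_KEY"] = "your-kaggle-api-key"        # placeholder (see KAGGLE_KEY below)

# 1-2. Download the dataset with the Kaggle CLI and unzip it
# (the dataset slug and archive name are assumptions).
os.system("kaggle datasets download -d mmoreaux/environmental-sound-classification-50")
with zipfile.ZipFile("environmental-sound-classification-50.zip") as archive:
    archive.extractall("esc50")

# 3. Read the CSV that describes the dataset with pandas and keep only the
# recordings labelled as chirping birds (adjust the paths to the unzipped layout).
meta = pd.read_csv("esc50/esc50.csv")
birds = meta[meta["category"] == "chirping_birds"]
bird_files = [os.path.join("esc50", "audio", f) for f in birds["filename"]]
print(len(bird_files), "chirping-birds WAV files found")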

The only thing you have to do is update the following lines (an illustrative upload sketch follows this list):

  • KAGGLE_KEY, which holds the Kaggle API key used to download the dataset
  • api_key, which holds the project API key used to upload the dataset to Edge Impulse
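
To show what the upload step does with api_key, here is a minimal sketch that sends the WAV files to the Edge Impulse ingestion API. The original script may use a different upload mechanism (for example the edge-impulse-uploader CLI), and noise_files is a placeholder for the noise dataset downloaded in step 4.

import os

import requests

api_key = "ei_your_project_api_key"  # placeholder: Edge Impulse project API key


def upload_wav(path, label):
    # POST one WAV file to the project's training set with the given label.
    with open(path, "rb") as wav:
        response = requests.post(
            "https://ingestion.edgeimpulse.com/api/training/files",
            headers={"x-api-key": api_key, "x-label": label},
            files={"data": (os.path.basename(path), wav, "audio/wav")},
        )
    response.raise_for_status()


noise_files = []  # placeholder: fill with the noise WAV files from step 4

for path in bird_files:   # from the previous sketch
    upload_wav(path, "chirping_birds")
for path in noise_files:
    upload_wav(path, "noise")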

At the end of this phase, we will have two different classes:

  • chirping birds WAV files
  • noise WAV files

After the upload is complete, use the Split sample feature to split each WAV file belonging to the chirping birds class into shorter samples.

Creating the machine learning model

Now let us build the model with the following blocks (a rough sketch of the network follows the list):

  • MFE as the processing block
  • Neural Network (Keras) as the learning block
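
For the curious, the Neural Network (Keras) block corresponds roughly to a small convolutional classifier trained on the MFE features, like the sketch below. The layer sizes and the MFE output shape used here are illustrative assumptions; the actual defaults are whatever Edge Impulse Studio generates for your project.

import tensorflow as tf
from tensorflow.keras import layers

# Illustrative MFE output shape: n_frames time frames x n_bands frequency bands.
n_frames, n_bands = 99, 40  # assumption: depends on the MFE parameters

model = tf.keras.Sequential([
    layers.Reshape((n_frames, n_bands), input_shape=(n_frames * n_bands,)),
    layers.Conv1D(8, kernel_size=3, padding="same", activation="relu"),
    layers.MaxPooling1D(pool_size=2),
    layers.Conv1D(16, kernel_size=3, padding="same", activation="relu"),
    layers.MaxPooling1D(pool_size=2),
    layers.Flatten(),
    layers.Dropout(0.25),
    layers.Dense(2, activation="softmax"),  # chirping_birds vs. noise
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.summary()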

Next, extract the features from the uploaded samples and train the model using the default parameters. When the training process completes, you should get a result like the picture shown below:

[Figure: Edge Impulse training results for the chirping birds sound classification model]

The interesting aspect is that we can build a machine learning model in just a few minutes! Now it is time to download the library and import it into the Arduino IDE.

How to run the machine learning model on the Wio Terminal to classify sounds

In this last step, it is necessary to modify the default code generated by Edge Impulse and adapt it to the Wio Terminal. We will not cover the theory behind this code, but if you are curious, there is a great article by Seeed Studio that describes everything in detail.

Once you have imported the library into your IDE, open the “Continuous…” example and replace its content with the content of this GitHub file.

Next, replace the following line:

#include <audio_scene_mfe_conv1d_v1_inference.h>

with:

#include <Wio_Terminal_Chirping_birds_inferencing.h>

Notice that if you want to show the result on the Wio Terminal's built-in display, you should modify the enum too.

You are ready to run the machine learning model on your Wio Terminal: it can now recognize chirping birds.


Wrapping up

At the end of this tutorial, we have discovered how to build a machine learning model using Edge Impulse that can be used with the Wio Terminal to recognize chirping birds. We covered how to use a Colab script to create a custom dataset holding the chirping birds WAV files, and then how to build the machine learning model. Using the code made available by Seeed Studio, we modified the default Edge Impulse code so that it supports the Wio Terminal too.
