Finger Detection

The Artificial Intelligence (AI) + Hardware Learning Kit

Instructions and resources

Project Overview

In this project, we will use MediaPipe to identify the positions of 21 hand landmarks, calculating whether fingers are bent or straight to determine the total number of fingers displayed. The system processes this data in real-time and sends the results via serial communication to a connected Pico microcontroller to control LED lights. This functionality is integrated with an external hardware setup, where LEDs light up to reflect the detected finger count. The project demonstrates the seamless combination of computer vision, embedded systems, and real-time interaction.
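One way to make the "bent or straight" decision concrete (a minimal sketch, not necessarily the tutorial's exact method) is to measure the angle at a finger's middle (PIP) joint from three of the 21 landmarks and treat the finger as straight when that joint is nearly flat. Here mcp, pip, and tip stand for the (x, y) coordinates of the finger's knuckle, middle joint, and tip landmarks taken from MediaPipe's output:

import numpy as np

def joint_angle(a, b, c):
    # Angle (in degrees) at point b formed by points a-b-c, each an (x, y) pair.
    v1 = np.array(a) - np.array(b)
    v2 = np.array(c) - np.array(b)
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-6)
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

def is_straight(mcp, pip, tip, threshold=160.0):
    # Treat the finger as straight when its PIP joint is almost fully extended.
    return joint_angle(mcp, pip, tip) > threshold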

Hardware

The kit provides the following components for this project:

  • A Development Board: EIM STEPico

  • A Breadboard

  • 10 LED lights (5 colors)

  • Multiple jumper wires

  • 1 USB cable (USB-A to USB-C)

Breadboard, STEPico, LED lights, jumper wires, and USB cable

Tutorial

This project comes with a detailed 60-page tutorial packed with step-by-step instructions and helpful illustrations. In this chapter, we move on to building a real-time application on top of a pre-trained CNN recognition model. To make the project more practical, we also introduce some principles and uses of microcontrollers.

The Learning Part

Hardware Setup

  1. Place your STEPico at one end of the breadboard. For the steps that follow, it is convenient to align the board's first pair of pins with the first row of power-rail slots.

Insert STEPico
  2. Insert the LEDs into the breadboard. Make sure each LED’s anode (long leg) is connected to a separate row.

Adding LED lights and jumper wires
  3. Connect one of the STEPico's GND pins to the breadboard's ground rail. The circuit is now ready for the next step. A quick MicroPython wiring check is sketched after the figure below.

Connect the ground and one STEPico GND pin
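Before moving on, you may want to confirm the wiring with a short MicroPython test. The sketch below assumes five of the LED anodes are wired to GP0 through GP4; those pin numbers are placeholders, so substitute whichever GP pins you actually used.

import time
from machine import Pin

# Assumed pin numbers -- replace with the GP pins your LEDs are actually wired to.
leds = [Pin(n, Pin.OUT) for n in (0, 1, 2, 3, 4)]

while True:
    for led in leds:      # light each LED in turn so a miswired row is easy to spot
        led.on()
        time.sleep(0.3)
        led.off()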

Microcontroller and Python

Once your hardware setup is complete, it's time to explore how the components and LEDs interface with one another, and the best tool for controlling them is a microcontroller.

In this project, we use a Raspberry Pi RP2040-based microcontroller, and the tutorial includes two sections of walk-through code that guide you through the foundational steps and acquaint you with microcontroller configuration and MicroPython programming.

Check out this page for more information on setting up your STEPico and MicroPython: STEPico & Micropython
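As a preview of how the microcontroller side of this project can look, here is a minimal MicroPython sketch that reads one finger count per line over the USB serial connection and lights that many LEDs. The GP0–GP4 pin numbers and the one-digit-per-line protocol are assumptions for illustration; adapt them to your wiring and to the host script you write later.

from machine import Pin

LED_PINS = (0, 1, 2, 3, 4)                 # assumed wiring -- change to your GP pins
leds = [Pin(n, Pin.OUT) for n in LED_PINS]

def show_count(count):
    # Light the first `count` LEDs and switch the rest off.
    for i, led in enumerate(leds):
        led.value(1 if i < count else 0)

while True:
    line = input()                         # one count per line arrives over USB serial
    if line.strip().isdigit():
        show_count(min(int(line), len(leds)))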

Setting Up Your Development Environment

To start programming in the Cursor IDE for your RP2040 microcontroller project, ensure you have Python 3.9 or 3.10 installed. You'll need to set up your development environment with the following packages:

pip3 install opencv-python mediapipe pyserial

To set up a virtual environment in the Cursor IDE, open a new terminal and run this command to create one:

python3 -m venv venv

The second "venv" is the folder name for your virtual environment and can be replaced with any other name. To activate the virtual environment:

For Windows (PowerShell):

venv\Scripts\Activate.ps1

For macOS:

source venv/bin/activate

When setting up a virtual environment on Windows, PowerShell's execution policy might prevent the Activate.ps1 script from running. You could work in the global environment to avoid changing the execution policy, but that can make it harder to keep your package versions under control.
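If you do prefer to keep using a virtual environment on Windows, a common approach is to relax the execution policy for your user account only. This is a general PowerShell setting rather than something specific to this kit, so review it before applying it:

Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser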

Installing these packages also brings in numpy as a dependency, while math and time come from the Python standard library; all three are useful for handling the finger-movement data in this project. Below are the modules imported by the Python script:

import cv2               # OpenCV: webcam capture, drawing, and display
import mediapipe as mp   # MediaPipe Hands: the 21 hand landmarks
import math              # math utilities for landmark geometry
import numpy as np       # vector math on landmark coordinates
import time              # simple timing between frames
import serial            # pyserial: send the finger count to the STEPico

By following these steps, you'll be well-equipped to manage your project's hardware and software integration effectively.
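Putting everything together, here is one possible end-to-end sketch of the host-side script (a simplified outline, not the tutorial's full program). It captures webcam frames, runs MediaPipe Hands, counts extended fingers with a simple tip-above-joint test (a lighter-weight check than the angle method sketched earlier, which assumes the hand is held roughly upright), and writes one count per line to the STEPico over serial. The serial port name is a placeholder; use the port your board appears as, for example a COM port on Windows.

import cv2
import mediapipe as mp
import serial
import time

SERIAL_PORT = "/dev/ttyACM0"       # placeholder -- e.g. "COM3" on Windows
ser = serial.Serial(SERIAL_PORT, 115200, timeout=1)
time.sleep(2)                      # give the board a moment after the port opens

mp_hands = mp.solutions.hands
FINGER_TIPS = [8, 12, 16, 20]      # index, middle, ring, pinky tip landmarks
FINGER_PIPS = [6, 10, 14, 18]      # corresponding PIP joints

def count_fingers(hand):
    # Count extended fingers, assuming the hand is roughly upright in the frame.
    lm = hand.landmark
    count = 0
    for tip, pip in zip(FINGER_TIPS, FINGER_PIPS):
        if lm[tip].y < lm[pip].y:  # tip above the PIP joint -> finger straight
            count += 1
    # Crude thumb check: the thumb tip sits farther from the pinky base when extended.
    if abs(lm[4].x - lm[17].x) > abs(lm[3].x - lm[17].x):
        count += 1
    return count

cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        fingers = 0
        if results.multi_hand_landmarks:
            fingers = count_fingers(results.multi_hand_landmarks[0])
        ser.write(f"{fingers}\n".encode())   # one count per line for the Pico to read
        cv2.putText(frame, str(fingers), (30, 60),
                    cv2.FONT_HERSHEY_SIMPLEX, 2, (0, 255, 0), 3)
        cv2.imshow("Finger Detection", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

cap.release()
cv2.destroyAllWindows()
ser.close()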

Video Walkthrough
