
Experimenting with Predictive AI Models:
Hand & Face Tracking
by Steve Wu | GIXD 503 Creative Prototyping | ACCD 11 December, 2025

Introduction

This is part 1 of the last project of GIXD 503. This week, we experimented with predictive AI models and built applications using the ML5.js library. You can view the code for Hand & Face Tracking Keypoints, Pinch Gesture Detection, and Banana island with Hand Control!. (Camera access required.)

Trying out the models

Several AI models are introduced in this project, such as the image classification model from Teachable Machine, and the handPose, gesture, and faceMesh tracking models from ml5.

The ML5.js library has incorporated these models so that they operate with the P5.js library, which is familiar to us from the last few weeks.

With these models, I can now track my hand or face from a video feed in a P5.js sketch.

face and hand tracking in P5.js
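As a rough sketch of how this wiring looks, here is a minimal hand-tracking setup based on the standard ml5.js handPose pattern. The callback name gotHands and the canvas size are my own choices, and faceMesh can be swapped in the same way:

```javascript
let handPose;
let video;
let hands = [];

function preload() {
  // Load the ml5 handPose model before the sketch starts
  handPose = ml5.handPose();
}

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO); // requires camera access
  video.size(640, 480);
  video.hide();
  // Start continuous detection; results arrive via the callback
  handPose.detectStart(video, gotHands);
}

function gotHands(results) {
  // Each result is one detected hand with an array of keypoints
  hands = results;
}

function draw() {
  image(video, 0, 0, width, height);
  for (let hand of hands) {
    for (let kp of hand.keypoints) {
      fill(0, 255, 0);
      noStroke();
      circle(kp.x, kp.y, 10); // mark each tracked keypoint
    }
  }
}
```

The callback pattern matters here: detection runs asynchronously, so draw() just renders whatever the most recent results were.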

The model provides quite detailed tracking data, including 3D data points, so I gave those a try as well.

hand tracking with 3D landmarks
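For the 3D version, a WEBGL canvas can be used to place the landmarks in space. This is only a sketch under assumptions: it assumes the results expose a keypoints3D array (mirroring MediaPipe's world landmarks; whether and how it's present depends on the ml5 build), and scalePoint is a hypothetical helper with a scale factor picked purely for visibility:

```javascript
let handPose;
let video;
let hands = [];

function preload() {
  handPose = ml5.handPose(); // assumed ml5 handPose model
}

function setup() {
  createCanvas(640, 480, WEBGL);
  video = createCapture(VIDEO);
  video.size(640, 480);
  video.hide();
  handPose.detectStart(video, (results) => { hands = results; });
}

// Pure helper (hypothetical): scale a small world-space landmark
// into something visible on screen.
function scalePoint(kp, factor = 400) {
  return { x: kp.x * factor, y: kp.y * factor, z: kp.z * factor };
}

function draw() {
  background(0);
  for (let hand of hands) {
    if (!hand.keypoints3D) continue; // guard: 3D data may be absent
    for (let kp of hand.keypoints3D) {
      const p = scalePoint(kp);
      push();
      translate(p.x, p.y, p.z); // position each landmark in 3D
      fill(255);
      noStroke();
      sphere(4);
      pop();
    }
  }
}
```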

Gesture Detection

Next, I tried to add some behavior and incorporate the models into my sketches.

I first tried having it recognize a pinch gesture.


if (hands[0]) {
    let hand = hands[0];
    // Fingertip keypoints reported by the hand-tracking model
    let indexX = hand.index_finger_tip.x;
    let indexY = hand.index_finger_tip.y;
    let thumbX = hand.thumb_tip.x;
    let thumbY = hand.thumb_tip.y;
    push();
    fill(255, 255, 255); // white by default
    if (dist(indexX, indexY, thumbX, thumbY) < 50) {
        fill(0, 255, 0); // green when the tips are close: a pinch
    }
    // Draw the cursor at the midpoint between the two fingertips
    circle((indexX + thumbX) * 0.5, (indexY + thumbY) * 0.5, 20);
    pop();
}

By taking the coordinates of the index finger tip and the thumb tip, I can calculate the distance between them. If they appear less than 50 px apart on screen, that means I'm doing a pinch.

In this block of code I also drew a circle between my index finger tip and thumb tip, and had it change color from white to green when a pinch is detected.
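The pinch check can also be factored into small pure helpers, which makes it easy to test without a camera. The names isPinch and pinchCursor are my own, and the 50 px threshold matches the sketch above:

```javascript
// Hypothetical helper: true when two fingertip points are closer
// than `threshold` pixels (default 50, matching the sketch).
function isPinch(indexTip, thumbTip, threshold = 50) {
  const dx = indexTip.x - thumbTip.x;
  const dy = indexTip.y - thumbTip.y;
  return Math.hypot(dx, dy) < threshold;
}

// Midpoint between the two tips, used as the cursor position.
function pinchCursor(indexTip, thumbTip) {
  return {
    x: (indexTip.x + thumbTip.x) * 0.5,
    y: (indexTip.y + thumbTip.y) * 0.5,
  };
}
```

Keeping the geometry separate from the drawing code means the threshold can be tuned, or replaced with a hysteresis band, without touching draw().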

Pinch gesture detected!

Adding to sketch

Lastly, I tried to incorporate the models in my previous sketches.

I thought of the Monkey Island sketch and decided to do my experiments there. I first adjusted the system so that objects would not disappear (die) when I try to grab them.

Then I loaded the hand tracking model and created a pinch cursor between my index finger tip and thumb tip. With a new pinching boolean in each object's class file, I can stop an object from moving on its own while it's being pinched. Now when I pinch near an object, I set its pinching flag to true and set its pos to the pinch cursor position.
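The mechanic can be sketched as a small class. This is a simplified stand-in, not the actual Monkey Island code: the class name Grabbable and the drift velocity are mine, and the real objects would carry their own movement logic:

```javascript
// Hypothetical sketch of the pinch mechanic: an object stops
// moving on its own while `pinching` is true and instead
// follows the pinch cursor.
class Grabbable {
  constructor(x, y) {
    this.pos = { x, y };
    this.vel = { x: 1, y: 0 }; // autonomous drift (placeholder values)
    this.pinching = false;
  }

  update(cursor) {
    if (this.pinching && cursor) {
      // Grabbed: snap to the pinch cursor position
      this.pos.x = cursor.x;
      this.pos.y = cursor.y;
    } else {
      // Free: keep moving on its own
      this.pos.x += this.vel.x;
      this.pos.y += this.vel.y;
    }
  }
}
```

In the sketch loop, the pinch check from earlier would flip pinching on the nearest object and pass the cursor into update() each frame.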

Banana and Monkey being pinched

Next Up

Next, I'm planning to incorporate face detection to make another fun application. I'll link it here when it's ready.

You can also access the sketches in this article here:
Hand & Face Tracking Keypoints
Pinch Gesture Detection
Banana island with Hand Control!