
Gesture-Controlled Particle System: A Three.js + MediaPipe Tutorial

Build an interactive gesture-controlled 3D particle system using Three.js and MediaPipe Hands. Wave your hand and watch thousands of particles snap into shapes in mid-air.

The AI Cowboys Team · 3 min read
3D particle system responding to hand gestures in a web browser

Wave Your Hand. Watch Particles Obey.

Imagine waving your hand and watching thousands of glowing particles snap into a sphere, cube, or logo in mid-air. That is what you will build in this tutorial — a gesture-controlled 3D particle system you can control with your fingers, using Google AI Studio as a coding copilot.

No installs. No frameworks. Just one HTML file and a browser.

What You Will Build

A browser-based application that:

  • Shows a small mirrored webcam preview in the bottom-right corner
  • Creates a particle system of 6,000+ points in Three.js
  • Uses finger count as gestures to switch particle formations
  • Responds in real time — 1 finger shows a rotating sphere, 2 fingers a cube, and so on

The Tech Stack

Everything runs from CDN — no build step required:

  • Three.js (from CDN) — handles all 3D rendering via WebGL
  • MediaPipe Hands + camera_utils (from CDN) — tracks one hand from your webcam in real time
  • Vanilla JavaScript — no frameworks, no bundlers, no complexity

Under the Hood

The hand tracking is powered by Google's MediaPipe Hands model running entirely in the browser. No data leaves your machine. The model detects 21 hand landmarks per frame, and we count the extended fingers to determine which gesture the user is making.
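To make the landmark-to-gesture step concrete, here is one way to count extended fingers. The fingertip indices (4, 8, 12, 16, 20) come from MediaPipe's documented hand-landmark layout; the per-finger comparison and the thumb threshold below are illustrative assumptions, not the article's exact code:

```javascript
// Count extended fingers from MediaPipe's 21 hand landmarks.
// Landmarks are normalized {x, y} coordinates; y grows downward in image space.
// Fingertip indices in MediaPipe Hands: thumb 4, index 8, middle 12, ring 16, pinky 20.
function countFingers(landmarks) {
  let count = 0;
  // Index/middle/ring/pinky: treated as extended when the tip sits
  // above (smaller y than) the PIP joint two indices earlier.
  for (const tip of [8, 12, 16, 20]) {
    if (landmarks[tip].y < landmarks[tip - 2].y) count++;
  }
  // Thumb heuristic (assumption): extended when the tip is spread
  // horizontally away from the thumb MCP joint (landmark 2).
  if (Math.abs(landmarks[4].x - landmarks[2].x) > 0.05) count++;
  return count;
}
```

In practice you would call this inside MediaPipe's `onResults` callback, using the first entry of `results.multiHandLandmarks`.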

The 3D graphics are pure Three.js and WebGL. Each particle is a point in a BufferGeometry, and we lerp (linearly interpolate) positions between formations to create smooth shape transitions when gestures change.
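A minimal sketch of that setup — the particle count and easing factor are assumptions, while the BufferGeometry wiring in the comments follows the standard Three.js pattern:

```javascript
const PARTICLE_COUNT = 6000;

// Flat [x0, y0, z0, x1, y1, z1, ...] arrays back the BufferGeometry's
// position attribute. This builds the "random scatter" formation.
function randomScatter(count) {
  const pos = new Float32Array(count * 3);
  for (let i = 0; i < pos.length; i++) pos[i] = (Math.random() - 0.5) * 20;
  return pos;
}

// Each frame, ease every coordinate toward its slot in the target formation.
// A smaller alpha gives a slower, smoother morph between shapes.
function lerpPositions(positions, target, alpha = 0.06) {
  for (let i = 0; i < positions.length; i++) {
    positions[i] += (target[i] - positions[i]) * alpha;
  }
}

// In Three.js the arrays plug in like this (sketch):
//   const geometry = new THREE.BufferGeometry();
//   geometry.setAttribute("position", new THREE.BufferAttribute(positions, 3));
//   const points = new THREE.Points(geometry, new THREE.PointsMaterial({ size: 0.05 }));
//   // per frame: lerpPositions(positions, target);
//   //            geometry.attributes.position.needsUpdate = true;
```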

How to Run It

  • Save the file as index.html in a folder (e.g., gesture-particles/)
  • Double-click index.html in that folder to open it in Chrome or any modern browser
  • Allow camera access when prompted
  • Hold up fingers to switch formations

That is it. No server, no build process, no dependencies to install.
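The single-file structure behind those steps looks roughly like this. The version pins and exact CDN URLs below are illustrative assumptions; check the Three.js and MediaPipe documentation for current ones:

```html
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8" />
  <title>Gesture Particles</title>
</head>
<body>
  <!-- Small mirrored webcam preview, bottom-right -->
  <video id="webcam" autoplay playsinline
         style="position:fixed; right:12px; bottom:12px; width:160px; transform:scaleX(-1)"></video>
  <canvas id="scene"></canvas>

  <!-- Everything from CDN, no build step (URLs are illustrative) -->
  <script src="https://cdn.jsdelivr.net/npm/three@0.150.0/build/three.min.js"></script>
  <script src="https://cdn.jsdelivr.net/npm/@mediapipe/hands/hands.js"></script>
  <script src="https://cdn.jsdelivr.net/npm/@mediapipe/camera_utils/camera_utils.js"></script>

  <script>
    // App code goes here: build the particle system, wire MediaPipe's
    // onResults callback to a finger counter, and lerp particles each frame.
  </script>
</body>
</html>
```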

Gesture Mappings

| Fingers | Formation |
|---------|-----------|
| 0 (fist) | Random scatter |
| 1 | Rotating sphere |
| 2 | Cube |
| 3 | Helix / spiral |
| 4 | Torus |
| 5 (open hand) | Logo / custom shape |
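In code, the table can become a simple lookup. The formation names here are illustrative, not the article's exact identifiers:

```javascript
// Map the detected finger count to a formation name, matching the table above.
const FORMATIONS = {
  0: "scatter", // fist
  1: "sphere",
  2: "cube",
  3: "helix",
  4: "torus",
  5: "logo",   // open hand
};

function formationFor(fingerCount) {
  // Clamp to the 0-5 range so a noisy detection never returns undefined.
  const n = Math.min(5, Math.max(0, fingerCount));
  return FORMATIONS[n];
}
```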

Customization

The system is designed to be modified. Some ideas:

  • Replace the logo shape with a particle rendering of your company logo
  • Optimize for lower-end hardware by reducing particle count while keeping the visual effect smooth
  • Add color transitions that change based on gesture type
  • Integrate audio reactivity so particles respond to music as well as gestures

You can use Google AI Studio or any AI coding assistant to rapidly iterate on customizations — paste the HTML, describe what you want changed, and get updated code back.

Why This Matters

Gesture-controlled interfaces are moving from research labs to production applications. The combination of browser-based hand tracking (MediaPipe) and GPU-accelerated 3D rendering (Three.js/WebGL) makes it possible to build experiences that previously required specialized hardware and native applications.

For product demos, trade show installations, interactive presentations, and educational tools, gesture-controlled particle systems demonstrate what is possible when AI meets creative technology — directly in the browser, with zero friction.

Explore our AI solutions or visit our GitHub to see the code in action.