Hello! I am the creator of this experiment.

Glad to see the conversation about hand tracking in the browser over here.

This demo is part of a series of creative experiments on using real-time hand tracking in the browser for creative interactions. I will be posting more experiments soon.

Tech background: I am using MediaPipe to control the hand rig in threejs. MediaPipe provides landmarks that are used to control a threejs Skeleton (hierarchy of bones with rotations).
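For the curious, here is a minimal sketch of that idea. The landmark indices follow MediaPipe's 21-point hand model; the quaternion helper mirrors three.js's `Quaternion.setFromUnitVectors` but is reimplemented here so the snippet is self-contained. The rest direction and the exact rig setup are assumptions, not the demo's actual code:

```javascript
// Direction from landmark a to landmark b, normalized.
// MediaPipe landmarks are {x, y, z} objects in normalized image space.
function boneDirection(a, b) {
  const d = [b.x - a.x, b.y - a.y, b.z - a.z];
  const len = Math.hypot(d[0], d[1], d[2]);
  return d.map(v => v / len);
}

// Quaternion rotating unit vector `from` onto unit vector `to`
// (same math as three.js Quaternion.setFromUnitVectors; the
// antiparallel edge case is ignored for brevity).
function quatFromUnitVectors(from, to) {
  const dot = from[0] * to[0] + from[1] * to[1] + from[2] * to[2];
  const cross = [
    from[1] * to[2] - from[2] * to[1],
    from[2] * to[0] - from[0] * to[2],
    from[0] * to[1] - from[1] * to[0],
  ];
  const w = 1 + dot;
  const len = Math.hypot(cross[0], cross[1], cross[2], w);
  return { x: cross[0] / len, y: cross[1] / len, z: cross[2] / len, w: w / len };
}

// Example: index-finger proximal bone, driven by MediaPipe
// landmarks 5 (index MCP) and 6 (index PIP).
const mcp = { x: 0, y: 0.5, z: 0 };
const pip = { x: 0, y: 0.46, z: 0 };      // finger pointing "up" in image space
const restDirection = [0, -1, 0];          // bone's rest pose direction (assumption)
const q = quatFromUnitVectors(restDirection, boneDirection(mcp, pip));
// q would then be assigned to the corresponding three.js Bone's quaternion.
```

In the real rig each bone's rotation would also be expressed relative to its parent bone rather than in world space, which is where the Skeleton hierarchy comes in.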

Feel free to ask, I will answer any questions!

The hand tracking is spot on, judging by the small image in the bottom-left corner: steady and accurate positioning of two hands. :-)

Unfortunately, the rendering of the hands in the large window jumps all over the place on Firefox on Ubuntu (Razer laptop). :-(

Cool progress, really nice work!

Not quite high enough fidelity to handle ASL though.

Some issues I ran into testing it, if it's something that interests you:

- Cannot distinguish closed vs. open fingers (always adds gaps between fingers, even if they're touching) (B)
- Can't handle crossed fingers (R)
- Doesn't seem to like extended vs. curled fingers in some cases (H)
- Other failed letters: (Q), (E?), (F?), (G), (S), (U/V)

But when signing naturally, it seems to get enough of the shapes and orientation correct for me to understand what I'm seeing. I'm sure there are things it'd trip up on because of some of the above weaknesses in detecting hand shapes, but it does seem to get movement, orientation, and position "good enough".

Doesn't really work well while holding objects or during faster movements, which I imagine would be restrictive for gaming or simulation purposes. It might be useful as a replacement for Leap Motion, though. I can see this working for manipulating a desktop environment.
This is so awesome. Fantastic demo.

An "air piano" seems well within the realm of possibility now.

This is unfortunately a bit buggy. I can see in the 2D image that it is tracking when my hand is turned backwards and held relatively flat to the camera, but the 3D view shows my hands as curled.
This is really cool to see. I don't have any problems or bugs using it, but it seems like the 3D rendering of the virtual hands is not quite as fast as the tracking itself. I wonder if and how this could be used in a meaningful way. As a feature for things like Google Quick Draw it would be fun.
Cool mashup! For anyone interested, I found this codepen from Google where you can play with Mediapipe in your browser: https://codepen.io/mediapipe/pen/RwGWYJw
Really well done! I've taken a stab at this problem with mixed results. This is leaps and bounds beyond most of my attempts. Thanks for sharing!
It doesn't allow me to select which camera to use, unfortunately. Chrome is set to use one, but the site probably uses the first one it finds.
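For what it's worth, the browser does expose a way to list cameras and request a specific one via `navigator.mediaDevices`. A rough sketch (the `pickCamera` helper is hypothetical, not part of the demo):

```javascript
// Pick the deviceId of the nth video input from an enumerateDevices() result.
// Returns null if there aren't that many cameras.
function pickCamera(devices, index = 0) {
  const cams = devices.filter(d => d.kind === 'videoinput');
  return cams[index] ? cams[index].deviceId : null;
}

// In the browser, the demo could then do something like:
//
//   const devices = await navigator.mediaDevices.enumerateDevices();
//   const id = pickCamera(devices, 1); // e.g. the second camera
//   const stream = await navigator.mediaDevices.getUserMedia({
//     video: id ? { deviceId: { exact: id } } : true,
//   });
//
// Note that device labels (and sometimes stable deviceIds) are only
// available after the user has granted camera permission.
```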
I guess one could make a fun theremin game out of this.
Wow this works much better than I had expected. Well done!
Jack, it's amazing. Very impressive stuff.
It fails when both hands touch or cross.