Google has launched a new AI Experiment, Move Mirror, which tracks your movements and finds still photos that match your poses. It uses a catalog of over 80,000 images, including shots of people cooking, doing karate and skiing, and displays them in real time, creating a flipbook-style effect.
Grant the Move Mirror website permission to access your webcam, start throwing some shapes, then save the resulting animation as a GIF and share it online.
Move Mirror is an interesting demonstration of what Google calls pose estimation: tracking the position of a person’s body as they move. It’s a tricky task – people come in a huge range of shapes and sizes, there are often inanimate objects and other individuals in the frame, and some people use assistive devices like wheelchairs and crutches, all of which makes it difficult to gauge the position of their limbs.
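In practice, pose estimation models reduce each frame to a small set of body keypoints (wrists, elbows, knees and so on), each with a confidence score that reflects exactly the difficulties above, such as occluded limbs. A minimal sketch in Python, assuming a simplified 2D keypoint format loosely modelled on what such systems output (the names and threshold here are illustrative, not any specific API):

```python
# Illustrative only: a pose represented as named 2D keypoints, each with a
# confidence score in [0, 1] expressing how sure the model is about it.

def low_confidence_parts(pose, threshold=0.5):
    """Return the names of keypoints the model is unsure about."""
    return [name for name, (x, y, score) in pose.items() if score < threshold]

pose = {
    "left_wrist":  (120.0, 310.0, 0.92),
    "right_wrist": (260.0, 305.0, 0.88),
    "left_knee":   (140.0, 540.0, 0.35),  # occluded limb, low confidence
}

print(low_confidence_parts(pose))  # ['left_knee']
```

An application can simply ignore keypoints below the threshold rather than trust a guess about a limb the camera can’t see.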
Strike a pose, there’s nothing to it
Motion capture suits and infrared technology work well, but the need for dedicated hardware means they aren’t always practical, and they’re tough for the average developer to implement in their own apps.
Move Mirror demonstrates how machine learning models can be run right in a web browser, inferring pose data from an ordinary webcam or mobile phone camera. It has lots of potential uses, including teaching dance moves and guiding home yoga workouts, and because no data is sent to third-party servers, users can pose in privacy.
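The matching step behind the flipbook effect can be thought of as a nearest-neighbour search: flatten each pose’s keypoints into a vector, normalise it so a person’s size and distance from the camera don’t matter, then find the catalogue image whose vector points in the most similar direction. A simplified sketch in Python (the production system also weights keypoints by confidence and uses a fast search index; the pose data here is made up for illustration):

```python
import math

def to_vector(pose):
    """Flatten (x, y) keypoints into one vector and L2-normalise it,
    so matching ignores how large the person appears in the frame."""
    flat = [c for (x, y) in pose for c in (x, y)]
    norm = math.sqrt(sum(c * c for c in flat))
    return [c / norm for c in flat]

def cosine_similarity(a, b):
    # Both inputs are unit vectors, so the dot product is the cosine.
    return sum(x * y for x, y in zip(a, b))

def best_match(query_pose, catalog):
    """Return the (label, pose) catalogue entry closest to the query pose."""
    qv = to_vector(query_pose)
    return max(catalog, key=lambda item: cosine_similarity(qv, to_vector(item[1])))

# Toy catalogue of two poses, each a list of (x, y) keypoints.
catalog = [
    ("arms_up",   [(0.0, 1.0), (1.0, 1.0)]),
    ("arms_down", [(0.0, 0.0), (1.0, 0.0)]),
]
query = [(0.1, 0.9), (0.9, 1.1)]  # roughly an arms-up pose

print(best_match(query, catalog)[0])  # arms_up
```

Running this kind of comparison in the browser against a precomputed index is what lets the matching feel instant without uploading any video.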
Move Mirror is part of Google AI Experiments – a collection of videos, games and interactive tools that demonstrate how machine learning works, and what it’s capable of.
Other Experiments include AutoDraw, which invites you to sketch an object, then matches it to a piece of clipart, and AI Duet, which takes melodies played on a virtual piano, and interprets and embellishes them to create a response.