Strike a pose and Move Mirror will find images to match it

Web Log: To ensure privacy, Google says your image never leaves your laptop

When looking for images on Google, you usually type in related keywords or, using Reverse Image Search, drag a picture into the search engine to see similar results. Google's new AI experiment Move Mirror lets you search through a database of images by standing in front of your webcam and striking a pose.

Move Mirror finds images based on your movements and works inside most web browsers. Once you back up so your whole body is in the frame, you can karate chop, do a handstand or freeze Gangnam Style, and PoseNet, the machine learning technology built by Google, translates your image into joint data, a set of x,y co-ordinates for your key joints, which is used to find similar poses in the image dataset. Once matching images are found, you can turn them into a GIF to share with friends.
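The article doesn't include any code, but PoseNet ships as a TensorFlow.js model, so a rough sketch of how joint data could be pulled from a webcam frame in the browser might look like the following. The function name, video element and default options are illustrative assumptions, not Move Mirror's actual implementation.

```typescript
import '@tensorflow/tfjs';
import * as posenet from '@tensorflow-models/posenet';

// Illustrative sketch: estimate a single pose from a webcam <video> element
// and reduce it to named joints with x,y co-ordinates.
async function getJointData(video: HTMLVideoElement) {
  const net = await posenet.load();            // load the PoseNet model
  const pose = await net.estimateSinglePose(video);

  // Each keypoint is a named joint (e.g. 'leftElbow') with a pixel
  // position and a confidence score.
  return pose.keypoints.map(k => ({
    part: k.part,
    x: k.position.x,
    y: k.position.y,
    score: k.score,
  }));
}
```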

To ensure privacy, Google says your image never leaves your laptop: only the joint data taken from your pose is compared with the 80,000 images in the dataset. That dataset, incidentally, is curated by Move Mirror for quality (full-body images) rather than drawn from everything indexed by Google's search engine.
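The article doesn't describe exactly how the joint data is matched against the dataset, but one simple, hypothetical way to compare two poses is to normalise each set of joints and take the cosine similarity of the flattened x,y vectors, as sketched below. The names and the normalisation choice are assumptions made for illustration only.

```typescript
interface Joint { part: string; x: number; y: number; }

// Shift joints to a common origin and scale to unit length so that people of
// different sizes and positions in the frame can still be compared.
function toVector(joints: Joint[]): number[] {
  const minX = Math.min(...joints.map(j => j.x));
  const minY = Math.min(...joints.map(j => j.y));
  const flat = joints.flatMap(j => [j.x - minX, j.y - minY]);
  const norm = Math.sqrt(flat.reduce((s, v) => s + v * v, 0)) || 1;
  return flat.map(v => v / norm);
}

// Cosine similarity between two poses: 1.0 means an identical pose shape.
function poseSimilarity(a: Joint[], b: Joint[]): number {
  const va = toVector(a);
  const vb = toVector(b);
  return va.reduce((s, v, i) => s + v * vb[i], 0);
}
```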

"We were tickled by the idea of being able to search an archive by pose. What if you could strike a pose and get a result that was the dance move you were doing," said Jane Friedhoff and Irene Alvarado, creative technologists at Google Creative Lab.

READ MORE: https://www.blog.google/technology/ai/move-mirror-you-move-and-80000-images-move-you/