chuoling.github.io — MediaPipe Hands (solution page): MediaPipe Hands is a high-fidelity hand and finger tracking solution. It employs machine learning (ML) to infer 21 3D landmarks of a hand from just a single frame. See more: https://chuoling.github.io/mediapipe/solutions/hands.html
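The 21 landmarks mentioned above follow a fixed topology. As a quick orientation, the index-to-name mapping can be written out in plain Python; the names below mirror MediaPipe's `HandLandmark` enum (`mediapipe.solutions.hands.HandLandmark`), but this sketch is standalone and does not require the `mediapipe` package:

```python
# Landmark index map used by MediaPipe Hands (21 keypoints per hand).
# Each landmark carries normalized (x, y) image coordinates plus a
# relative depth z. Names mirror mediapipe.solutions.hands.HandLandmark.
HAND_LANDMARKS = [
    "WRIST",
    "THUMB_CMC", "THUMB_MCP", "THUMB_IP", "THUMB_TIP",
    "INDEX_FINGER_MCP", "INDEX_FINGER_PIP", "INDEX_FINGER_DIP", "INDEX_FINGER_TIP",
    "MIDDLE_FINGER_MCP", "MIDDLE_FINGER_PIP", "MIDDLE_FINGER_DIP", "MIDDLE_FINGER_TIP",
    "RING_FINGER_MCP", "RING_FINGER_PIP", "RING_FINGER_DIP", "RING_FINGER_TIP",
    "PINKY_MCP", "PINKY_PIP", "PINKY_DIP", "PINKY_TIP",
]

# Fingertip indices are often all an application needs (e.g. pinch detection).
FINGERTIPS = [i for i, name in enumerate(HAND_LANDMARKS) if name.endswith("_TIP")]

print(len(HAND_LANDMARKS))  # 21
print(FINGERTIPS)           # [4, 8, 12, 16, 20]
```

In the real API, `results.multi_hand_landmarks[h].landmark[i]` is indexed by exactly these positions, so index 0 is always the wrist and index 8 the index fingertip.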
arXiv.org — [2006.10214] MediaPipe Hands: On-device Real-time Hand Tracking (18 Jun 2020): We present a real-time on-device hand tracking pipeline that predicts hand skeleton from single RGB camera for AR/VR applications. The pipeline consists of two models. Authors: Fan Zhang, Valentin Bazarevsky, Andrey Vakunov, Andrei Tkachenka, George Sung, Chuo-Ling Chang, Matt… Cite as: arXiv:2006.10214 [cs.CV]. Publish year: 2020. See more: https://arxiv.org/abs/2006.10214
Google Research — On-Device, Real-Time Hand Tracking with MediaPipe (19 Aug 2019): 3D hand perception in real time on a mobile phone via MediaPipe. Our solution uses machine learning to compute 21 3D keypoints of a hand from a video frame; depth is indicated in grayscale. See more: https://research.google/blog/on-device-real-time-hand-tracking-with-mediapipe/
Google Research — MediaPipe Hands: On-device Real-time Hand Tracking (publication page): We present a real-time on-device hand tracking pipeline that predicts hand skeleton from only single camera input for AR/VR applications. The pipeline consists of two models: 1) a palm … See more: https://research.google/pubs/mediapipe-hands-on-device-real-time-hand-tracking/
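The abstract's two-model design (a palm detector feeding a hand landmark model) is what makes the pipeline real-time: the expensive detector only runs when no hand is being tracked, and on later frames the previous landmarks supply the region of interest. A minimal control-flow sketch, with stub models standing in for the actual networks (all function names, box values, and dummy points here are hypothetical placeholders, not MediaPipe code):

```python
from typing import List, Optional, Tuple

Landmarks = List[Tuple[float, float, float]]  # 21 (x, y, z) keypoints

def detect_palm(frame) -> Optional[Tuple[int, int, int, int]]:
    """Stand-in for model 1, the palm detector: returns a hand box or None."""
    return (40, 40, 120, 120)  # fixed box, purely for illustration

def predict_landmarks(frame, box) -> Landmarks:
    """Stand-in for model 2, the landmark model: returns 21 (x, y, z) points."""
    x, y, w, h = box
    return [(x + w / 2, y + h / 2, 0.0)] * 21  # dummy points at box center

def box_from_landmarks(lms: Landmarks) -> Tuple[int, int, int, int]:
    """Derive the next frame's region of interest from current landmarks."""
    xs = [p[0] for p in lms]
    ys = [p[1] for p in lms]
    return (int(min(xs)), int(min(ys)),
            int(max(xs) - min(xs)) or 1, int(max(ys) - min(ys)) or 1)

def track(frames):
    prev_box = None
    for frame in frames:
        # Run the costly palm detector only when no hand is being tracked.
        box = prev_box if prev_box is not None else detect_palm(frame)
        if box is None:
            prev_box = None  # lost the hand; re-detect next frame
            continue
        lms = predict_landmarks(frame, box)
        prev_box = box_from_landmarks(lms)  # landmarks seed the next ROI
        yield lms

skeletons = list(track(range(3)))  # three dummy "frames"
print(len(skeletons), len(skeletons[0]))
```

The same detect-once-then-track pattern is how the released MediaPipe solution keeps latency low on mobile hardware; the sketch only shows the control flow, not the models themselves.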