Department of Computer Science and Engineering (Artificial Intelligence and Machine Learning), ACE Engineering College, Ghatkesar, Telangana, India.
World Journal of Advanced Research and Reviews, 2026, 30(01), 844-852
Article DOI: 10.30574/wjarr.2026.30.1.0860
Received on 25 February 2026; revised on 04 April 2026; accepted on 07 April 2026
In this project, we designed and built a real-time hand tracking and gesture visualization system that integrates Google’s MediaPipe framework with the visual programming environment TouchDesigner. Our system captures live video from a standard USB or built-in webcam, uses MediaPipe’s hand-landmark detection model to extract 21 skeletal keypoints per hand, and transmits the resulting coordinate data over a local OSC (Open Sound Control) channel to TouchDesigner, where it drives GPU-accelerated visual effects. We observed that the system sustains an end-to-end processing delay below 35 milliseconds and a consistent frame rate of 28–30 FPS on mid-range consumer hardware, satisfying the real-time interaction threshold defined in HCI literature. Gesture recognition across five canonical gestures reached 94.4% accuracy under controlled lighting. This work demonstrates the practicality of low-cost, marker-less hand tracking as an input modality for augmented reality (AR), virtual reality (VR), interactive digital art, and contactless interfaces. Full experimental results, system architecture, implementation details, and future research directions are presented.
Hand Tracking; Gesture Recognition; MediaPipe; Computer Vision; TouchDesigner; HCI
Atul Kumar Ramotra, Srishylam Bandi, Tharun Ballu and Koushik Karnati. Real-time hand tracking visualization using MediaPipe and TouchDesigner. World Journal of Advanced Research and Reviews, 2026, 30(01), 844-852. Article DOI: https://doi.org/10.30574/wjarr.2026.30.1.0860.