<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Wearable Computing | Yang Gao</title><link>https://ygao36buffalo.github.io/tags/wearable-computing/</link><atom:link href="https://ygao36buffalo.github.io/tags/wearable-computing/index.xml" rel="self" type="application/rss+xml"/><description>Wearable Computing</description><generator>Hugo Blox Builder (https://hugoblox.com)</generator><language>en-us</language><lastBuildDate>Fri, 25 Apr 2025 00:00:00 +0000</lastBuildDate><image><url>https://ygao36buffalo.github.io/media/icon_hu7729264130191091259.png</url><title>Wearable Computing</title><link>https://ygao36buffalo.github.io/tags/wearable-computing/</link></image><item><title>From Wrist to Finger: Hand Pose Tracking Using Ring-Watch Wearables</title><link>https://ygao36buffalo.github.io/project/from-wrist-to-finger/</link><pubDate>Fri, 25 Apr 2025 00:00:00 +0000</pubDate><guid>https://ygao36buffalo.github.io/project/from-wrist-to-finger/</guid><description>&lt;h2 id="project-overview-chi-2025-late-breaking-work">Project Overview (CHI 2025 Late-Breaking Work)&lt;/h2>
&lt;p>Hand pose tracking is a cornerstone for advancing human-computer interaction applications, from virtual reality to prosthetics control. However, existing vision-based systems and wearable devices often face limitations in portability, usability, and practicality. In this &lt;strong>CHI 2025 Late-Breaking Work&lt;/strong>, we introduce a novel multimodal hand pose tracking framework that integrates data from an IMU-equipped ring and EMG sensors embedded in a wrist-worn device.&lt;/p>
&lt;p>This work represents an important step towards enabling more practical and accessible hand tracking solutions for everyday use.&lt;/p>
&lt;h2 id="key-innovations--progress">Key Innovations &amp;amp; Progress:&lt;/h2>
&lt;h3 id="1-novel-ring-watch-wearable-design">1. Novel Ring-Watch Wearable Design&lt;/h3>
&lt;p>We propose a compact and ergonomic wearable system that combines a single IMU-equipped ring (worn on the thumb) with EMG sensors integrated into a smartwatch. This design prioritizes wearability and comfort, addressing the common limitations of bulky or multi-device tracking systems while accurately capturing intricate finger and hand motion data.&lt;/p>
&lt;h3 id="2-multi-sensor-fusion-for-precise-tracking">2. Multi-Sensor Fusion for Precise Tracking&lt;/h3>
&lt;p>Our framework leverages the complementary strengths of both motion dynamics (from the IMU) and muscle activity (from the EMG sensors). This deep learning-based sensor fusion approach achieves precise 3D hand pose reconstruction, providing robust performance even in complex or high-speed gestures. We developed a transformer-based model with time encoding and cross-modal attention mechanisms for optimal data integration.&lt;/p>
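&lt;p>As a rough illustration of the cross-modal attention at the heart of this fusion, the minimal NumPy sketch below lets IMU frames query the EMG stream, so each motion frame attends to the muscle activity most relevant to it. This is illustrative only, not the trained model: all shapes, names, and the random weights are assumptions.&lt;/p>

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_modal_attention(imu_feats, emg_feats, w_q, w_k, w_v):
    """One cross-attention step: IMU frames query the EMG stream.

    imu_feats: (T_imu, d) windowed IMU features
    emg_feats: (T_emg, d) windowed EMG features
    w_q, w_k, w_v: (d, d) projection matrices (random here, learned in practice)
    """
    q = imu_feats @ w_q                        # queries from motion dynamics
    k = emg_feats @ w_k                        # keys from muscle activity
    v = emg_feats @ w_v                        # values from muscle activity
    scores = q @ k.T / np.sqrt(q.shape[-1])    # scaled dot-product similarity
    return softmax(scores, axis=-1) @ v        # (T_imu, d) fused features

rng = np.random.default_rng(0)
d = 16
imu = rng.standard_normal((20, d))   # 20 IMU frames
emg = rng.standard_normal((50, d))   # 50 EMG frames (EMG samples faster)
fused = cross_modal_attention(imu, emg, *(rng.standard_normal((d, d)) for _ in range(3)))
print(fused.shape)  # (20, 16)
```

&lt;p>In the actual model the projections are learned end to end and combined with time encodings; the sketch only shows the attention mechanics that let two streams with different sampling rates be aligned in feature space.&lt;/p>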
&lt;h2 id="media--resources">Media &amp;amp; Resources:&lt;/h2>
&lt;ul>
&lt;li>&lt;strong>Paper:&lt;/strong> &lt;a href="https://doi.org/10.1145/3706599.3720220" target="_blank" rel="noopener">Extended Abstract (ACM DL)&lt;/a>&lt;/li>
&lt;li>&lt;strong>Video:&lt;/strong> &lt;a href="https://www.youtube.com/watch?v=g6bTCVMpNf4" target="_blank" rel="noopener">Project Video&lt;/a>&lt;/li>
&lt;/ul>
</description></item><item><title>PressInPose: Integrating Pressure and Inertial Sensors for Full-Body Pose Estimation in Activities</title><link>https://ygao36buffalo.github.io/project/pressinpose/</link><pubDate>Thu, 21 Nov 2024 00:00:00 +0000</pubDate><guid>https://ygao36buffalo.github.io/project/pressinpose/</guid><description>&lt;h2 id="project-overview">Project Overview&lt;/h2>
&lt;p>Accurate human body posture assessment through wearable technology has significant implications across various fields, including sports science, clinical diagnostics, rehabilitation, and VR interaction. Traditional methods often face limitations due to complex setups or environmental constraints. To address these challenges, we developed &lt;strong>PressInPose&lt;/strong>, an innovative system that integrates pressure and inertial sensors for precise full-body pose estimation in dynamic activities. This work was published in &lt;strong>Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT)&lt;/strong> and will be presented at &lt;strong>UbiComp 2025&lt;/strong>.&lt;/p>
&lt;h2 id="key-innovations--contributions">Key Innovations &amp;amp; Contributions:&lt;/h2>
&lt;h3 id="1-novel-multi-sensor-fusion">1. Novel Multi-Sensor Fusion&lt;/h3>
&lt;p>PressInPose employs an advanced shoe insole embedded with pressure sensors and an Inertial Measurement Unit (IMU), coupled with a single wrist-mounted IMU. This unique multi-modal sensor fusion approach allows for a comprehensive analysis of human biomechanics, capturing intricate body dynamics that traditional single-sensor systems often miss.&lt;/p>
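&lt;p>A practical first step when fusing streams captured at different rates is resampling them onto a shared timeline. The sketch below interpolates insole pressure channels onto the wrist IMU's timestamps; the sensor rates and channel counts are assumptions for illustration, not the paper's hardware specification.&lt;/p>

```python
import numpy as np

def align_streams(t_target, t_src, x_src):
    """Linearly interpolate a multi-channel signal onto a shared timeline.

    t_target: timestamps to sample at (seconds)
    t_src:    original timestamps of the signal
    x_src:    (len(t_src), C) channel values
    """
    return np.stack([np.interp(t_target, t_src, x_src[:, c])
                     for c in range(x_src.shape[1])], axis=1)

# Hypothetical rates: insole pressure at 50 Hz, wrist IMU at 100 Hz.
t_press = np.arange(0, 2.0, 1 / 50)
t_imu = np.arange(0, 2.0, 1 / 100)
pressure = np.random.default_rng(1).random((t_press.size, 8))  # 8 insole cells
pressure_100hz = align_streams(t_imu, t_press, pressure)
print(pressure_100hz.shape)  # (200, 8)
```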
&lt;h3 id="2-llm-powered-virtual-data-augmentation">2. LLM-Powered Virtual Data Augmentation&lt;/h3>
&lt;p>To enhance the robustness and generalization of our system, we leveraged large language models (LLMs) to generate virtual human motion sequences. These sequences were utilized to create synthetic IMU data for data augmentation, effectively addressing the challenge of limited real-world data availability and variability, especially for complex and dynamic movements.&lt;/p>
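&lt;p>One common way to turn a generated motion sequence into synthetic IMU data is to differentiate the virtual trajectory twice and add gravity. The sketch below is a hedged illustration of that idea, assuming an ideal world-frame accelerometer with no noise or orientation model; the function name and sampling rate are placeholders, not the paper's pipeline.&lt;/p>

```python
import numpy as np

GRAVITY = np.array([0.0, 0.0, 9.81])

def synth_accelerometer(positions, dt):
    """Turn a virtual joint trajectory into synthetic accelerometer frames.

    positions: (T, 3) world-space positions of a body point over time
    dt: sampling interval in seconds
    Returns (T-2, 3): linear acceleration plus gravity, as an ideal
    world-frame accelerometer would report it.
    """
    vel = np.gradient(positions, dt, axis=0)  # finite-difference velocity
    acc = np.gradient(vel, dt, axis=0)        # finite-difference acceleration
    return acc[1:-1] + GRAVITY                # drop edge frames, add gravity

# Toy "generated" motion: a hand tracing a circle in the xy-plane at 60 Hz.
dt = 1 / 60
t = np.arange(0, 1, dt)
circle = np.stack([np.cos(2 * np.pi * t), np.sin(2 * np.pi * t), np.zeros_like(t)], axis=1)
imu_frames = synth_accelerometer(circle, dt)
print(imu_frames.shape)  # (58, 3)
```

&lt;p>In practice such synthetic frames would also need sensor noise, bias, and a rotation into the device frame before they resemble real IMU output.&lt;/p>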
&lt;h3 id="3-physical-kinematics-modeling--deep-learning-network">3. Physical Kinematics Modeling &amp;amp; Deep Learning Network&lt;/h3>
&lt;p>Our approach uniquely combines physical kinematics modeling based on pressure data with a multi-region human posture estimation network. This integration allows PressInPose to accurately capture interactions and dependencies between different body parts, leading to superior accuracy in pose reconstruction.&lt;/p>
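&lt;p>A simple example of a physical quantity derivable from insole pressure is the center of pressure, the force-weighted centroid of the sensor cells, which a kinematics model can use as a ground-contact constraint. The sketch below is generic; the four-cell layout and force readings are made up for illustration.&lt;/p>

```python
import numpy as np

def center_of_pressure(cell_xy, cell_forces):
    """Force-weighted centroid of insole cells.

    cell_xy:     (N, 2) sensor cell positions on the insole (meters)
    cell_forces: (N,) per-cell normal force readings (newtons)
    """
    total = cell_forces.sum()
    if total == 0:
        return None  # foot off the ground: no contact constraint
    return (cell_forces[:, None] * cell_xy).sum(axis=0) / total

# Hypothetical 4-cell insole: a heel pair and a forefoot pair.
cells = np.array([[0.02, 0.00], [0.06, 0.00], [0.02, 0.20], [0.06, 0.20]])
forces = np.array([30.0, 30.0, 10.0, 10.0])  # weight mostly on the heel
cop = center_of_pressure(cells, forces)
print(cop)  # [0.04 0.05]
```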
&lt;h2 id="media--resources">Media &amp;amp; Resources:&lt;/h2>
&lt;ul>
&lt;li>&lt;strong>Paper:&lt;/strong> &lt;a href="https://doi.org/10.1145/3699773" target="_blank" rel="noopener">Full Paper (ACM DL)&lt;/a>&lt;/li>
&lt;/ul>
</description></item></channel></rss>