<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>VR | Yang Gao</title><link>https://ygao36buffalo.github.io/tags/vr/</link><atom:link href="https://ygao36buffalo.github.io/tags/vr/index.xml" rel="self" type="application/rss+xml"/><description>VR</description><generator>Hugo Blox Builder (https://hugoblox.com)</generator><language>en-us</language><lastBuildDate>Fri, 25 Apr 2025 00:00:00 +0000</lastBuildDate><image><url>https://ygao36buffalo.github.io/media/icon_hu7729264130191091259.png</url><title>VR</title><link>https://ygao36buffalo.github.io/tags/vr/</link></image><item><title>From Wrist to Finger: Hand Pose Tracking Using Ring-Watch Wearables</title><link>https://ygao36buffalo.github.io/project/from-wrist-to-finger/</link><pubDate>Fri, 25 Apr 2025 00:00:00 +0000</pubDate><guid>https://ygao36buffalo.github.io/project/from-wrist-to-finger/</guid><description>&lt;h2 id="project-overview-chi-2025-late-breaking-work">Project Overview (CHI 2025 Late-Breaking Work)&lt;/h2>
&lt;p>Hand pose tracking is a cornerstone for advancing human-computer interaction applications, from virtual reality to prosthetics control. However, existing vision-based systems and wearable devices often face limitations in portability, usability, and practicality. In this &lt;strong>CHI 2025 Late-Breaking Work&lt;/strong>, we introduce a novel multimodal hand pose tracking framework that integrates data from an IMU-equipped ring and EMG sensors embedded in a wrist-worn device.&lt;/p>
&lt;p>This work represents an important step towards enabling more practical and accessible hand tracking solutions for everyday use.&lt;/p>
&lt;h2 id="key-innovations--progress">Key Innovations &amp;amp; Progress:&lt;/h2>
&lt;h3 id="1-novel-ring-watch-wearable-design">1. Novel Ring-Watch Wearable Design&lt;/h3>
&lt;p>We propose a compact and ergonomic wearable system that combines a single IMU-equipped ring (worn on the thumb) with EMG sensors integrated into a smartwatch. This design prioritizes wearability and comfort, addressing the common limitations of bulky or multi-device tracking systems while accurately capturing intricate finger and hand motion data.&lt;/p>
&lt;h3 id="2-multi-sensor-fusion-for-precise-tracking">2. Multi-Sensor Fusion for Precise Tracking&lt;/h3>
&lt;p>Our framework leverages the complementary strengths of motion dynamics (from the IMU) and muscle activity (from the EMG sensors). This deep learning-based sensor fusion approach achieves precise 3D hand pose reconstruction and remains robust even during complex, high-speed gestures. We developed a transformer-based model with time encoding and cross-modal attention mechanisms for optimal data integration.&lt;/p>
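&lt;p>As a rough illustration only (not the authors&amp;rsquo; actual implementation), cross-modal attention fusion of IMU and EMG feature streams can be sketched in PyTorch. All dimensions, layer sizes, and names below are illustrative assumptions:&lt;/p>

```python
# Hypothetical sketch of cross-modal attention fusion for IMU + EMG
# streams, loosely following the paper's high-level description.
# Dimensions, names, and layer choices are assumptions, not the
# authors' code.
import torch
import torch.nn as nn

class CrossModalFusion(nn.Module):
    def __init__(self, d_model=64, n_heads=4, imu_dim=6, emg_dim=8):
        super().__init__()
        self.imu_proj = nn.Linear(imu_dim, d_model)   # project raw IMU features
        self.emg_proj = nn.Linear(emg_dim, d_model)   # project raw EMG features
        # IMU queries attend over EMG keys/values (cross-modal attention)
        self.cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.head = nn.Linear(d_model, 21 * 3)        # e.g. 21 hand joints in 3D

    def forward(self, imu, emg, time_enc):
        # imu: (batch, T, imu_dim); emg: (batch, T, emg_dim)
        # time_enc: (batch, T, d_model) sinusoidal or learned time encoding
        q = self.imu_proj(imu) + time_enc
        kv = self.emg_proj(emg) + time_enc
        fused, _ = self.cross_attn(q, kv, kv)         # fuse motion + muscle cues
        return self.head(fused)                       # per-frame 3D joint coords

model = CrossModalFusion()
out = model(torch.randn(2, 50, 6), torch.randn(2, 50, 8), torch.randn(2, 50, 64))
print(out.shape)  # torch.Size([2, 50, 63])
```

&lt;p>The key design idea is that one modality&amp;rsquo;s embeddings serve as queries while the other supplies keys and values, so each time step of motion data can be refined by the muscle-activity signal at related moments.&lt;/p>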
&lt;h2 id="media--resources">Media &amp;amp; Resources:&lt;/h2>
&lt;ul>
&lt;li>&lt;strong>Paper:&lt;/strong> &lt;a href="https://doi.org/10.1145/3706599.3720220" target="_blank" rel="noopener">Extended Abstract (ACM DL)&lt;/a>&lt;/li>
&lt;li>&lt;strong>Video:&lt;/strong> &lt;a href="https://www.youtube.com/watch?v=g6bTCVMpNf4" target="_blank" rel="noopener">Project Video&lt;/a>&lt;/li>
&lt;/ul>
</description></item><item><title>SandTouch: Empowering Virtual Sand Art in VR with AI Guidance and Emotional Relief</title><link>https://ygao36buffalo.github.io/project/sandtouch/</link><pubDate>Fri, 25 Apr 2025 00:00:00 +0000</pubDate><guid>https://ygao36buffalo.github.io/project/sandtouch/</guid><description>&lt;h2 id="project-overview">Project Overview&lt;/h2>
&lt;p>Sand painting is a unique and valuable art form, but it&amp;rsquo;s often constrained by physical equipment and a steep learning curve. To address these challenges, we developed &lt;strong>SandTouch&lt;/strong>, a novel VR sand painting system that offers an immersive and intuitive experience, closely mirroring real-world sand manipulation. This project was presented at &lt;strong>CHI 2025&lt;/strong> (the premier conference in Human-Computer Interaction).&lt;/p>
&lt;h2 id="key-features--contributions">Key Features &amp;amp; Contributions:&lt;/h2>
&lt;h3 id="1-realistic-hand-sand-interaction">1. Realistic Hand-Sand Interaction&lt;/h3>
&lt;p>We designed SandTouch to create a highly realistic and natural interaction between the user&amp;rsquo;s hands and virtual sand, allowing direct manipulation without external devices or controllers. This approach provides an intuitive, device-free interface and emphasizes the authenticity of the interaction. Our system reproduces the fine tactile sensations of real sand manipulation, complemented by realistic sound feedback.&lt;/p>
&lt;h3 id="2-ai-guidance-for-creative-flow">2. AI Guidance for Creative Flow&lt;/h3>
&lt;p>A pioneering aspect of SandTouch is the integration of an AI agent powered by a large language model (LLM). The agent interprets users&amp;rsquo; creative intentions in real time and offers contextually relevant artistic suggestions. This feature simplifies the creative process, enhances interactivity, and helps users (especially beginners) refine their artwork through technique recommendations and composition analysis.&lt;/p>
&lt;h3 id="3-emotional-relief--immersion">3. Emotional Relief &amp;amp; Immersion&lt;/h3>
&lt;p>Beyond artistic creation, SandTouch prioritizes user well-being. It incorporates calming and responsive soundscapes that react to user gestures, reinforcing a relaxing atmosphere. Comprehensive evaluations demonstrated a significant increase in user engagement and immersion, with the realistic sound feedback enhancing emotional relief and deepening the painting experience. This highlights its potential for therapeutic applications.&lt;/p>
&lt;h2 id="media--resources">Media &amp;amp; Resources:&lt;/h2>
&lt;ul>
&lt;li>&lt;strong>Paper:&lt;/strong> &lt;a href="https://doi.org/10.1145/3706598.3714275" target="_blank" rel="noopener">Full Paper (ACM DL)&lt;/a>&lt;/li>
&lt;li>&lt;strong>Video:&lt;/strong> &lt;a href="https://www.youtube.com/watch?v=6FYOCeU0liw" target="_blank" rel="noopener">Project Video&lt;/a>&lt;/li>
&lt;/ul>
</description></item></channel></rss>