Zhu Wang

About Me


I am an incoming Assistant Professor in the Department of Computer Science at the University of New Mexico, starting in Fall 2025. Previously, I was a postdoctoral researcher at the Future Reality Lab, New York University, after completing my PhD in Computer Science at NYU, advised by Prof. Ken Perlin. My research interests span several areas, including XR, HCI, robotics, and AI. More specifically, I have been working on: (1) VR-based human balance assessment and rehabilitation with motion analysis, eye-tracking, and force-sensing technologies; (2) XR-based multi-participant collaboration and communication; (3) interactions with mobile robots and drones; and (4) data-driven content generation and retrieval.


I am looking for prospective PhD students with critical thinking, self-motivation, research curiosity, and strong technical skills. Priority will be given to applicants with experience or strong interest in AI, Graphics, HCI, XR, or Robotics. In addition to prospective PhD students, my team also welcomes motivated undergraduate and master’s students who are interested in my research and are looking for opportunities to participate in research projects and publications. If you are interested in joining my team, please feel free to send me an email with your CV and a brief description of your background and research interests.

Selected Projects

† Equal contribution   * Equal advising   🏆 Best Paper Award
Frequency analyses of postural sway demonstrate the use of sounds for balance given vestibular loss
Anat V Lubetzky, Maura Cosetti, Daphna Harel, Katherine Scigliano, Marlee Sherrod, Zhu Wang, Agnieszka Roginska, Jennifer Kelly

Purpose: To investigate how adults with unilateral vestibular hypofunction and healthy controls incorporate visual and auditory cues for postural control in an abstract visual environment.
Conclusions: Patients with vestibular hypofunction used sounds to reduce sway in a static abstract environment but were somewhat destabilized by them in a dynamic environment. This suggests that sounds, when played from headphones, may function as an auditory anchor at certain levels of challenge and for specific tasks, regardless of whether the sound is stationary or moving. Our results support the view that increased sway in middle frequencies reflects vestibular dysfunction.

RoboTecture: A Scalable Shape-changing Interface Using Actuated Support Beams
Yuhan Wang, Keru Wang, Zhu Wang*, Ken Perlin*

We introduce RoboTecture, a cost-efficient and expandable shape-changing system that uses a self-lifting structure composed of modular robotic units that actuate support beams. RoboTecture generates dynamic surface displays and enclosures by modulating a grid of robotic units with linear movements, each with two actuators and four beams connecting to adjacent units. The modular design allows the structure to scale to different grid sizes and to be arranged in flexible layouts. The self-lifting nature of RoboTecture makes it possible to utilize the space on both sides of the surface. The sparse grid structure makes the system more efficient at simulating large-scale structures such as smart architecture, and the spaces between the beams enable objects to pass through the actuated surface for novel interactions. In this paper, we demonstrate several prototypes with different layouts and validate the proof of concept.

🏆Generative Terrain Authoring with Mid-air Hand Sketching in Virtual Reality
Yushen Hu, Keru Wang, Yuli Shao, Jan Plass, Zhu Wang*, Ken Perlin*

We present a VR-based terrain generation and authoring system that utilizes hand tracking and a generative model to allow users to quickly prototype natural landscapes, such as mountains, mesas, canyons, and volcanoes. Via positional hand tracking and hand gesture detection, users draw mid-air strokes to indicate the desired shapes of the landscapes. A Conditional Generative Adversarial Network, trained on real-world terrains and their height maps, then generates a realistic landscape that combines features of the training data with the mid-air strokes. In addition, users can further manipulate their mid-air strokes to edit the landscapes.

A Collaborative Multimodal XR Physical Design Environment
Keru Wang, Pincun Liu, Yushen Hu, Xiaoan Liu, Zhu Wang, Ken Perlin

Our collaborative XR system integrates physical and virtual design spaces, using video passthrough to superimpose visual modifications on physical objects. With features such as multimodal inputs, real-time physical object tracking, and object-based 3D annotation, it speeds up design iterations for digital prototyping involving physical objects.

“Push-That-There”: Tabletop Multi-robot Object Manipulation via Multimodal 'Object-level Instruction'
Keru Wang, Zhu Wang, Ken Nakagaki, Ken Perlin

The system enables users to intuitively control a multi-robot system that manipulates objects on tabletop surfaces through modalities such as gestures, GUI, tangible manipulation, and speech. The multi-robot system then autonomously and collectively executes the instructions using a generalizable control algorithm. This approach emphasizes object-level interaction, allowing users to work at a higher level without managing individual robot movements.

A Spatial Audio System for Co-Located Multi-Participant Extended Reality Experiences
Yi Wu, Agnieszka Roginska, Keru Wang, Zhu Wang, Ken Perlin

This paper presents the ongoing development of a spatial audio system for co-located, multi-participant, extended reality (CMXR) experiences. By integrating spatial audio and informative auditory displays, the system can enhance the sense of immersion and presence among participants and facilitate collaboration.

Decrease in Head Sway as a Measure of Sensory Integration Following Vestibular Rehabilitation: A Randomized Controlled Trial
Anat V Lubetzky, Daphna Harel, Santosh Krishnamoorthy, Gene Fu, Brittani Morris, Andrew Medlin, Zhu Wang, Ken Perlin, Agnieszka Roginska, Maura Cosetti, Jennifer Kelly

This study aimed to determine the extent to which sensory integration strategies, measured via head sway derived from a Head-Mounted Display (HMD), change in people with vestibular disorders following vestibular rehabilitation.

Asymmetrical VR for Education
Keru Wang, Zhu Wang, Ken Perlin

We present a system that utilizes hand-held devices for non-VR instructors, enabling them to explore VR content and interact with students who are fully immersed in VR. The instructor can observe the VR environment or switch between different students’ first-person views by using commonly available hand-held devices, such as smartphones and tablets. The instructor can also use hand-held devices to interact with the VR world itself. The students can see the real-time video stream of the physical environment as well as a video stream of the instructor.

Zero-shot Multi-modal Artist-controlled Retrieval and Exploration of 3D Object Sets
Kristofer Schlachter†, Benjamin Ahlbrand†, Zhu Wang, Ken Perlin, Valerio Ortenzi

Our approach allows for multi-modal conditional feature-driven retrieval through a 3D asset database, by utilizing a combination of input latent embeddings. We explore the effects of different combinations of feature embeddings across different input types and weighting methods.

Insight into postural control in unilateral sensorineural hearing loss and vestibular hypofunction
Anat V Lubetzky, Jennifer L Kelly, Daphna Harel, Agnieszka Roginska, Bryan D Hujsak, Zhu Wang, Ken Perlin, Maura Cosetti

This pilot study aimed to identify postural strategies in response to sensory perturbations (visual, auditory, somatosensory) in adults with and without sensory loss.

Mixed Reality Collaboration for Complementary Working Styles
Keru Wang, Zhu Wang, Karl Rosenberg, Zhenyi He, Dong Woo Yoo, Un Joo Christopher, Ken Perlin

Our project combines immersive VR, multitouch AR, real-time volumetric capture, motion capture, robotically-actuated tangible interfaces at multiple scales, and live coding, in service of a human-centric way of collaborating.

Contextual sensory integration training via head mounted display for individuals with vestibular disorders: a feasibility study
Anat V. Lubetzky, Jennifer Kelly, Zhu Wang, Marta Gospodarek, Gene Fu, John Sutera, Bryan D. Hujsak

The purpose of this study was to test the feasibility of a novel VR application (app) developed for a Head Mounted Display (HMD) to target dizziness, imbalance and sensory integration in a functional context for patients with vestibular disorders.

VRGaitAnalytics: Visualizing Dual Task Cost for VR Gait Assessment
Zhu Wang, Liraz Arie, Anat V Lubetzky, Ken Perlin

We present a novel low-cost VR gait assessment system that simulates virtual obstacles and visual, auditory, and cognitive loads while using motion tracking to assess participants’ walking performance. The system provides in-situ spatial visualization for trial playback and instantaneous outcome measures, enabling experimenters and participants to observe and interpret their performance.

Walking Balance Assessment with Eye-tracking and Spatial Data Visualization
Zhu Wang, Anat Lubetzky, Ken Perlin

We present a novel walking balance assessment system with eye tracking to investigate the role of eye movement in walking balance, and with spatial data visualization to better interpret and understand the experimental data. The spatial visualization includes instantaneous in-situ VR replay of the gaze, head, and feet, as well as data plots of the outcome measures. The system fills a need for real-time eye tracking and intuitive feedback in VR for experimenters, clinicians, and participants.

A Virtual Obstacle Course within Diverse Sensory Environments
Zhu Wang, Anat Lubetzky, Charles Hendee, Marta Gospodarek, Ken Perlin

We developed a novel assessment platform with untethered virtual reality, 3D sounds, and a pressure-sensing floor mat to help assess walking balance and obstacle negotiation under diverse sensory and/or cognitive loads.

Head Mounted Displays for Capturing Head Kinematics in Postural Tasks
Anat V. Lubetzky, Zhu Wang, Tal Krasovsky

We investigated concurrent validity of head tracking of two Head Mounted Displays (HMDs), Oculus Rift and HTC Vive, vs. a gold-standard motion capture system (Qualisys). Our results generally support the concurrent validity of Oculus Rift and HTC Vive head tracking during static and dynamic standing tasks in healthy young adults. Specific task- and direction-dependent differences should be considered when planning measurement studies using these novel tools.

A Virtual Reality Four-Square Step Test for Quantifying Dynamic Balance Performance in People with Persistent Postural Perceptual Dizziness
Moshe MH Aharoni, Anat V Lubetzky, Zhu Wang, Maya Goldman, Tal Krasovsky

We evaluated the feasibility of a novel paradigm for evaluation of dynamic balance within complex visual environments in people with PPPD. Results indicated that performance of the FSST-VR is feasible and did not aggravate symptoms for people with PPPD.

Head mounted display application for contextual sensory integration training: design, implementation, challenges and patient outcomes
Anat V Lubetzky, Jennifer Kelly, Zhu Wang, Makan TaghaviDilamani, Marta Gospodarek, Gene Fu, Erin Kuchlewski, Bryan Hujsak

We developed a VR HMD application that allows patients to practice contextual sensory integration (C.S.I.) while sitting, standing, turning, or stepping within diverse scenes. This application can become an integral part of vestibular rehabilitation. In this pilot study, usability and preliminary outcomes were tested in a mixed-methods descriptive study. Recommendations for future research and clinical implementation are discussed.

Virtual Environments for Rehabilitation of Postural Control Dysfunction
Zhu Wang, Anat Lubetzky, Marta Gospodarek, Makan TaghaviDilamani, Ken Perlin

We developed a novel virtual reality (VR) platform with 3-dimensional sounds to help improve sensory integration and visuomotor processing for postural control and fall prevention in individuals with balance problems related to sensory deficits, such as vestibular dysfunction (disease of the inner ear). The 3D game-like scenes make participants feel immersed and gradually expose them to situations that may induce dizziness, anxiety, or imbalance in daily living.