We designed a serious game, RedCapes, and evaluated it with 9 children with autism spectrum disorder (ASD) and 6 typically developing (TD) children.
We designed a 360° VR video-based system that depicts real-life scenarios from a refugee camp with both active and passive navigation.
This study uses virtual humans to teach children with autism to express themselves in a conversational manner.
StarRescue is a two-player collaborative tablet game that aims to enhance autistic children’s social skills.
This study explores privacy issues in families of children with autism through interviews with their parents.
This study uses machine learning algorithms to measure and predict stress and anxiety and to design interventions for them.
We explore the design concept of entangled habitation to advocate for a more-than-human perspective in human-nature interaction design.
A qualitative study based on semi-structured interviews with 20 older adults and 18 younger adults in China.
This project aims to design and generate virtual pets (VPs) based on personality and appearance traits.
Design and Evaluation of a Virtual Reality Serious Game for Promoting Understanding towards People with Color Vision Deficiency
This study aims to develop a unified definition of Jubensha and to investigate its impact on social interactions and players’ emotions through various mechanisms.
Enhancing Children’s Geography Learning with Tangible Augmented Reality Interaction.
Exploring Social Interactions with Collocated People and Connections with Surrounding Environment in an Augmented Reality Context.
“I'm Not Confident in Debiasing AI Systems Since I Know Too Little”: Teaching AI Creators About Gender Bias Through Hands-on Tutorials.
Curating a Socially Engaging and Informative Virtual Exhibition via Social Virtual Reality
Probing the Unique Strategies Users Take to Communicate in the Context of Mirrors in Social Virtual Reality
Exploring Designers’ Collaboration with Generative AI in a Multi-stakeholder Design Process: Take the Domain of Avatar Design as an Example
Pilgrimage to Pureland: Art, Perception and the Wutai Mural VR Reconstruction
"Voices Help Correlate Signs and Words":Analyzing Deaf and Hard-of-Hearing (DHH) TikTokers’ Content, Practices, and Pitfalls
GenRole: Leveraging Generative AI to Personalize Role Play for Social Skill Development in Autistic Children
Parental Perceptions of Children's d/Deaf Identity Shaping Technology Use: A Qualitative Study on Communication Technologies in Mixed-hearing Families
WooGu: an educational system that integrates tangible user interfaces, digital feedback, and embodied interaction to enhance children's understanding of farming processes.
Generative AI-assisted, practitioner-led personalized language intervention for autistic children.
We developed a system that transcribes nonverbal emotional cues in speech into emojis, helping DHH individuals better understand the implicit information behind transcribed text.
We proposed a comprehensive pipeline that leverages generative AI to automate the mural recreation process, and conducted a user study comparing the AI-generated scene with a hand-crafted one.
This study examines how NPC information cards in VR enhance emotional engagement and reduce VR sickness, while supporting a balance between exploration and guidance in cultural heritage experiences.
This study examines the challenges of integrating technology into non-pharmacological interventions for dementia.