
Hi, I'm Wooje Chang (he/him/his).
It rhymes with "blue jay".


I'm a Human-Computer Interaction researcher with a focus on physical devices.

Previously, I worked with Dr. Andrea Bianchi at KAIST, doing research primarily on (large-scale) haptics. 

Before KAIST, I majored in MechE and CogSci at Yale University.


Rendering Perceived Terrain Stiffness in VR Via Preload Variation Against Body-Weight

Wooje Chang, Seungwoo Je, Michel Pahud, Mike Sinclair, Andrea Bianchi | IEEE Transactions on Haptics 2023

PreloadStep is a novel platform that creates the illusion of walking on different types of terrain in Virtual Reality without requiring users to wear any special instrumentation. PreloadStep works by compressing a set of springs between two plates, with the amount of compression determining the perceived stiffness of the virtual terrain. The platform renders the perception of stiffness by applying preload forces of up to 824 N to different portions of the terrain, and it can change the stiffness illusion even while a user is standing on it. We evaluated PreloadStep in two perception studies (a perception-threshold study and a haptic-visual congruence study) and an example application, with the results indicating that it is a promising method for creating engaging virtual terrain experiences.


SpinOcchio: Understanding Haptic-Visual Congruency of Skin-Slip in VR with a Dynamic Grip Controller

Myung Jin Kim, Neung Ryu, Wooje Chang, Michel Pahud, Mike Sinclair, Andrea Bianchi | CHI 2022

This paper’s goal is to understand the haptic-visual congruency perception of skin-slip on the fingertips given visual cues in Virtual Reality (VR). We developed SpinOcchio (Spin for the spinning mechanism used, Occhio for the Italian word “eye”), a handheld haptic controller capable of rendering the thickness and slipping of a virtual object pinched between two fingers. This is achieved using a mechanism with spinning and pivoting disks that apply a tangential skin-slip movement to the fingertips. With SpinOcchio, we determined the baseline haptic discrimination threshold for skin-slip, and, using these results, we tested how haptic realism of motion and thickness is perceived with varying visual cues in VR. Surprisingly, the results show that in all cases, visual cues dominate over haptic perception. Based on these results, we suggest applications that leverage skin-slip and grip interaction, contributing further to realistic experiences in VR.


Design of Virtual Reality Application for Interaction Prototyping Remote Education
Best Paper Award

Hye-Young Jo, Wooje Chang, Hoonjin Jung, Andrea Bianchi | HCI Korea 2022

The COVID-19 pandemic has impacted education, especially in STEAM subjects such as Interaction Prototyping (a course involving physical computing), where hands-on practice is crucial. Studies had introduced virtual environments into STEAM education before the pandemic, but the shift to non-face-to-face education after the outbreak made this need even more pressing. In this paper, we propose virtual reality applications for remote Interaction Prototyping education that provide an intuitive and safe practice environment for students. First, we summarize the flow of the Interaction Prototyping class and explore the difficulties encountered before and after COVID-19 through expert interviews. Based on this, we derive design considerations for moving the Interaction Prototyping class from an offline setting to a virtual environment. Finally, we propose four interaction scenarios that can provide students with an immersive experience: realistic theory class, 3D library, circuit assembly, and mixed reality practice.

ProjecString: Turning an Everyday String Curtain into an Interactive Projection Display

Wooje Chang, Yeeun Shin, Yeon Soo Kim, Woohun Lee | SIGGRAPH 2022 Posters

We present ProjecString, a touch-sensitive string curtain projection display that enables novel interactions: touching, grasping, seeing through, and walking through the display. We embed capacitive-sensing conductive chains into an everyday string curtain, turning it into both a space divider and an interactive display. Transforming an everyday object into an interactive projection surface with a unique translucent quality creates interactions that are at once immersive and isolating.


Korean Emoticons: Understanding How Korean People Perceive Emotions in Text

Chowon Kang, Jong-ok Hong, Wooje Chang, Hyeon-Jeong Suk, Hwajung Hong | CSCW 2022 Posters

Online conversations through text are limited in how well they express emotions, which can cause miscommunication across cultures. In this work, we study Korean emotional expression in text, focusing on how people perceive emotional intentions conveyed through emotion-expressing Korean characters. We define these as Korean emoticons (`ㅋ', `ㅎ', `ㅠ'): onomatopoeic characters often used to express emotions in text-based communication. We examine participants' understanding and usage of Korean emoticons through an online survey, in which participants rated the emotional content of given sentences, and interviews, in which they described their personal experiences. We found that the number of Korean emoticons used evokes different emotions, and that negative emoticons amplify positive emotions in positive contexts while positive emoticons alleviate negative emotions in negative contexts; in neutral contexts, the impact of emoticons varies with the surrounding text. We further discuss design implications for how text-suggestion tools can help users account for emotional intentions.
