Episodes

  • CSCW 2024: Situating Empathy in HCI/CSCW: A Scoping Review
    2024/12/02

    Uğur Genç and Himanshu Verma. 2024. Situating Empathy in HCI/CSCW: A Scoping Review. Proc. ACM Hum.-Comput. Interact. 8, CSCW2, Article 513 (November 2024), 37 pages. https://doi.org/10.1145/3687052

    Empathy is considered a crucial construct within HCI and CSCW, yet our understanding of this complex concept remains fragmented and lacks consensus in existing research. In this scoping review of 121 articles from the ACM Digital Library, we synthesize the diverse perspectives on empathy and scrutinize its current conceptualization and operationalization. In particular, we examine the various interpretations and definitions of empathy, its applications, and the methodologies, findings, and trends in the field. Our analysis reveals a lack of consensus on the definitions and theoretical underpinnings of empathy, with interpretations ranging from understanding the experiences of others to an affective response to the other's situation. We observed that despite the variety of methods used to gauge empathy, the predominant approach remains self-assessed instruments, highlighting the lack of novel and rigorously established and validated measures and methods to capture the multifaceted manifestations of empathy. Furthermore, our analysis shows that previous studies have used a variety of approaches to elicit empathy, such as experiential methods and situational awareness. These approaches have demonstrated that shared stressful experiences promote community support and relief, while situational awareness promotes empathy through increased helping behavior. Finally, we discuss a) the potential and drawbacks of leveraging empathy to shape interactions and guide design practices, b) the need to find a balance between the collective focus of empathy and the (existing and dominant) focus on the individual, and c) the careful testing of empathic designs and technologies with real-world applications.

    https://dl.acm.org/doi/10.1145/3687052

    40 min
  • ICMI 2024: Exploring the Alteration and Masking of Everyday Noise Sounds using Auditory Augmented Reality
    2024/11/18

    Isna Alfi Bustoni, Mark McGill, and Stephen Anthony Brewster. 2024. Exploring the Alteration and Masking of Everyday Noise Sounds using Auditory Augmented Reality. In Proceedings of the 26th International Conference on Multimodal Interaction (ICMI '24). Association for Computing Machinery, New York, NY, USA, 154–163. https://doi.org/10.1145/3678957.3685750

    While noise-cancelling headphones can block out or mask environmental noise with digital sound, this costs the user situational awareness and information. With the advancement of acoustically transparent personal audio devices (e.g. headphones, open-ear audio frames), Auditory Augmented Reality (AAR), and real-time audio processing, it is feasible to preserve user situational awareness and relevant information whilst diminishing the perception of the noise. Through an online survey (n=124), this research explored users’ attitudes and preferred AAR strategy (keep the noise, make the noise more pleasant, obscure the noise, reduce the noise, remove the noise, and replace the noise) toward different types of noises from a range of categories (living beings, mechanical, and environmental) and varying degrees of relevance. It was discovered that respondents’ degrees of annoyance varied according to the kind of noise and its relevance to them. Additionally, respondents had a strong tendency to reduce irrelevant noise and retain more relevant noise. Based on our findings, we discuss how AAR can assist users in coping with noise whilst retaining relevant information through selectively suppressing or altering the noise, as appropriate.

    https://dl.acm.org/doi/10.1145/3678957.3685750

    14 min
  • ASSETS 2024: SoundHapticVR: Head-Based Spatial Haptic Feedback for Accessible Sounds in Virtual Reality for Deaf and Hard of Hearing Users
    2024/11/09

    Pratheep Kumar Chelladurai, Ziming Li, Maximilian Weber, Tae Oh, and Roshan L Peiris. 2024. SoundHapticVR: Head-Based Spatial Haptic Feedback for Accessible Sounds in Virtual Reality for Deaf and Hard of Hearing Users. In Proceedings of the 26th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS '24). Association for Computing Machinery, New York, NY, USA, Article 31, 1–17. https://doi.org/10.1145/3663548.3675639

    Virtual Reality (VR) systems use immersive spatial audio to convey critical information, but these audio cues are often inaccessible to Deaf or Hard-of-Hearing (DHH) individuals. To address this, we developed SoundHapticVR, a head-based haptic system that converts audio signals into haptic feedback using multi-channel acoustic haptic actuators. We evaluated SoundHapticVR through three studies: determining the maximum tactile frequency threshold on different head regions for DHH users, identifying the ideal number and arrangement of transducers for sound localization, and assessing participants’ ability to differentiate sound sources with haptic patterns. Findings indicate that tactile perception thresholds vary across head regions, necessitating consistent frequency equalization. Adding a front transducer significantly improved sound localization, and participants could correlate distinct haptic patterns with specific objects. Overall, this system has the potential to make VR applications more accessible to DHH users.

    https://dl.acm.org/doi/10.1145/3663548.3675639

    13 min
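    A rough sense of the audio-to-haptic mapping SoundHapticVR describes can be sketched in a few lines. This is a minimal illustrative example under assumed parameters, not the authors' implementation: the 50-250 Hz tactile band, the four-actuator layout, the cosine panning, and the per-region gains are all assumptions made for the sketch.

    ```python
    # Illustrative sketch (not the SoundHapticVR implementation): map a mono
    # audio frame plus a source azimuth onto per-actuator vibration amplitudes.
    import numpy as np
    from scipy.signal import butter, sosfilt

    FS = 48_000                                          # audio sample rate (assumed)
    ACTUATOR_AZIMUTHS = np.deg2rad([0, 90, 180, 270])    # front, right, back, left (assumed layout)
    REGION_GAIN = np.array([1.0, 0.8, 0.9, 0.8])         # per-region equalization (placeholder values)

    # Keep only a low-frequency tactile band (about 50-250 Hz), since
    # vibrotactile sensitivity falls off above a few hundred hertz.
    _sos = butter(4, [50, 250], btype="bandpass", fs=FS, output="sos")

    def haptic_frame(audio: np.ndarray, source_azimuth_rad: float) -> np.ndarray:
        """Return one vibration amplitude per actuator for this audio frame."""
        tactile = sosfilt(_sos, audio)
        energy = np.sqrt(np.mean(tactile ** 2))          # RMS energy of the frame
        # Simple cosine panning: actuators closer to the source vibrate harder.
        pan = np.maximum(0.0, np.cos(ACTUATOR_AZIMUTHS - source_azimuth_rad))
        return REGION_GAIN * pan * energy

    # Example: a 20 ms frame of a source located to the user's right.
    frame = np.random.randn(FS // 50)
    print(haptic_frame(frame, np.deg2rad(90)))
    ```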
  • ASSETS 2024: SeaHare: An omnidirectional electric wheelchair integrating independent, remote and shared control modalities
    2024/11/09

    Giulia Barbareschi, Ando Ryoichi, Midori Kawaguchi, Minato Takeda, and Kouta Minamizawa. 2024. SeaHare: An omnidirectional electric wheelchair integrating independent, remote and shared control modalities. In Proceedings of the 26th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS '24). Association for Computing Machinery, New York, NY, USA, Article 9, 1–16. https://doi.org/10.1145/3663548.3675657

    Depending on one’s needs, electric wheelchairs can feature different interfaces and driving paradigms, with control handed to the user, a remote pilot, or shared between them. However, these systems have generally been implemented on separate wheelchairs, making comparison difficult. We present the design of an omnidirectional electric wheelchair that can be controlled using two sensing seats detecting changes in the centre of gravity. One of the sensing seats is used by the person on the wheelchair, whereas the other is used as a remote control by a second person. We explore the use of the wheelchair under different control paradigms (independent, remote, and shared) from both the wheelchair and the remote-control seat with 5 dyads and 1 triad of participants, including both wheelchair users and non-users. Results highlight key advantages and disadvantages of the SeaHare in different paradigms, with participants’ perceptions affected by their skills and lived experiences, and reflections on how different control modes might suit different scenarios.

    https://dl.acm.org/doi/10.1145/3663548.3675657

    13 min
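    The seat-based control idea can be imagined roughly as follows: a centre-of-gravity offset measured by a sensing seat is mapped to an omnidirectional velocity command, and the rider's and remote seats can be blended for shared control. This is a hedged sketch; the dead zone, gain, speed cap, and blending weight are illustrative assumptions rather than values from the paper.

    ```python
    # Illustrative sketch (not the SeaHare implementation): turn centre-of-gravity
    # offsets from two sensing seats into an omnidirectional velocity command.
    from dataclasses import dataclass

    @dataclass
    class CoG:
        x: float   # lateral offset in metres (+ right)
        y: float   # fore-aft offset in metres (+ forward)

    DEAD_ZONE = 0.02      # ignore small postural sway (assumed, metres)
    GAIN = 4.0            # metres of offset -> metres per second (assumed)
    MAX_SPEED = 1.0       # speed cap in m/s (assumed)

    def seat_to_velocity(cog: CoG) -> tuple[float, float]:
        """Map one seat's CoG offset to (vx, vy) for an omnidirectional base."""
        def axis(v: float) -> float:
            if abs(v) < DEAD_ZONE:
                return 0.0
            return max(-MAX_SPEED, min(MAX_SPEED, GAIN * v))
        return axis(cog.x), axis(cog.y)

    def shared_control(rider: CoG, remote: CoG, alpha: float = 0.5) -> tuple[float, float]:
        """Blend rider and remote commands; alpha=1 is fully independent, 0 fully remote."""
        rx, ry = seat_to_velocity(rider)
        mx, my = seat_to_velocity(remote)
        return alpha * rx + (1 - alpha) * mx, alpha * ry + (1 - alpha) * my

    print(shared_control(CoG(0.05, 0.10), CoG(0.0, -0.03)))
    ```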
  • ASSETS 2024: Brain Body Jockey project: Transcending Bodily Limitations in Live Performance via Human Augmentation
    2024/11/07

    Giulia Barbareschi, Songchen Zhou, Ando Ryoichi, Midori Kawaguchi, Mark Armstrong, Mikito Ogino, Shunsuke Aoiki, Eisaku Ohta, Harunobu Taguchi, Youichi Kamiyama, Masatane Muto, Kentaro Yoshifuji, and Kouta Minamizawa. 2024. Brain Body Jockey project: Transcending Bodily Limitations in Live Performance via Human Augmentation. In Proceedings of the 26th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS '24). Association for Computing Machinery, New York, NY, USA, Article 18, 1–14. https://doi.org/10.1145/3663548.3675621

    Musicians with significant mobility limitations face unique challenges in using their bodies to interact with fans during live performances. In this paper we present the results of a collaboration between a professional DJ with advanced Amyotrophic Lateral Sclerosis and a group of technologists and researchers, culminating in two public live performances that leveraged human augmentation technologies to enhance the artist’s stage presence. Our system combines a Brain Machine Interface and an accelerometer-based trigger to select pre-programmed moves performed by robotic arms during a live event, as well as to facilitate direct physical interaction during a “Meet the DJ” event. Our evaluation includes ethnographic observations and interviews with the artist and members of the audience. Results show that the system allowed the artist and audience to feel a sense of unity, expanded the imagination of creative possibilities, and challenged conventional perceptions of disability in the arts and beyond.

    https://dl.acm.org/doi/10.1145/3663548.3675621

    14 min
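    The accelerometer-based trigger mentioned in the abstract can be pictured with a small sketch like the one below; the move names, jerk threshold, and debounce interval are hypothetical, and the BMI selection path and robot-arm control are not modelled here.

    ```python
    # Illustrative sketch (not the Brain Body Jockey implementation): fire the next
    # pre-programmed robotic-arm move when an accelerometer spike is detected.
    import math
    import time

    MOVES = ["wave", "point_to_crowd", "fist_pump"]   # hypothetical move names
    JERK_THRESHOLD = 2.5    # g above rest needed to trigger (assumed)
    DEBOUNCE_S = 1.0        # ignore re-triggers for this long (assumed)

    class MoveTrigger:
        def __init__(self):
            self._next = 0
            self._last_fire = 0.0

        def update(self, ax: float, ay: float, az: float) -> str | None:
            """Feed one accelerometer sample (in g); return a move name when triggered."""
            magnitude = math.sqrt(ax * ax + ay * ay + az * az)
            now = time.monotonic()
            if magnitude - 1.0 > JERK_THRESHOLD and now - self._last_fire > DEBOUNCE_S:
                self._last_fire = now
                move = MOVES[self._next % len(MOVES)]
                self._next += 1
                return move       # a real system would send this command to the robot arms
            return None

    trigger = MoveTrigger()
    print(trigger.update(0.1, 3.9, 0.2))   # strong jerk -> "wave"
    ```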
  • ISMAR 2024: Searching Across Realities: Investigating ERPs and Eye-Tracking Correlates of Visual Search in Mixed Reality
    2024/11/07

    F. Chiossi, I. Trautmannsheimer, C. Ou, U. Gruenefeld and S. Mayer, "Searching Across Realities: Investigating ERPs and Eye-Tracking Correlates of Visual Search in Mixed Reality," in IEEE Transactions on Visualization and Computer Graphics, vol. 30, no. 11, pp. 6997-7007, Nov. 2024, doi: 10.1109/TVCG.2024.3456172.

    Mixed Reality allows us to integrate virtual and physical content into users' environments seamlessly. Yet, how this fusion affects perceptual and cognitive resources and our ability to find virtual or physical objects remains uncertain. Displaying virtual and physical information simultaneously might lead to divided attention and increased visual complexity, impacting users' visual processing, performance, and workload. In a visual search task, we asked participants to locate virtual and physical objects in Augmented Reality and Augmented Virtuality to understand the effects on performance. We evaluated search efficiency and attention allocation for virtual and physical objects using event-related potentials, fixation and saccade metrics, and behavioral measures. We found that users were more efficient in identifying objects in Augmented Virtuality, while virtual objects gained saliency in Augmented Virtuality. This suggests that visual fidelity might increase the perceptual load of the scene. Reduced amplitude of the distractor positivity ERP and fixation patterns supported improved distractor suppression and search efficiency in Augmented Virtuality. We discuss design implications for mixed reality adaptive systems based on physiological inputs for interaction.

    https://ieeexplore.ieee.org/document/10679197

    16 min
  • ISMAR 2024: “As if it were my own hand”: inducing the rubber hand illusion through virtual reality for motor imagery enhancement
    2024/11/04

    S. Cheng, Y. Liu, Y. Gao and Z. Dong, "“As if it were my own hand”: inducing the rubber hand illusion through virtual reality for motor imagery enhancement," in IEEE Transactions on Visualization and Computer Graphics, vol. 30, no. 11, pp. 7086-7096, Nov. 2024, doi: 10.1109/TVCG.2024.3456147

    Brain-computer interfaces (BCI) are widely used in the field of disability assistance and rehabilitation, and virtual reality (VR) is increasingly used for visual guidance in BCI motor imagery (MI). Therefore, how to improve the quality of electroencephalogram (EEG) signals for MI in VR has emerged as a critical issue. People can perform MI more easily when they visualize the hand used for visual guidance as their own, and the Rubber Hand Illusion (RHI) can increase people's sense of ownership of a prosthetic hand. We proposed inducing the RHI in VR to enhance participants' MI ability and designed five induction methods: active movement, haptic stimulation, passive movement, active movement mixed with haptic stimulation, and passive movement mixed with haptic stimulation. We constructed a first-person training scenario to train participants' MI ability through the five induction methods. The experimental results showed that the training enhanced participants' feeling of ownership of the virtual hand in VR and improved their MI ability. Among these, mixing active movement with tactile stimulation proved particularly effective at enhancing MI. Finally, we developed a BCI system in VR utilizing this training method, and participants' performance improved after the training. This also suggests that our proposed method is promising for future application in BCI rehabilitation systems.

    https://ieeexplore.ieee.org/document/10669780

    19 min
  • ISMAR 2024: Filtering on the Go: Effect of Filters on Gaze Pointing Accuracy During Physical Locomotion in Extended Reality
    2024/11/01

    Pavel Manakhov, Ludwig Sidenmark, Ken Pfeuffer, and Hans Gellersen. 2024. Filtering on the Go: Effect of Filters on Gaze Pointing Accuracy During Physical Locomotion in Extended Reality. IEEE Transactions on Visualization and Computer Graphics 30, 11 (Nov. 2024), 7234–7244. https://doi.org/10.1109/TVCG.2024.3456153

    Eye tracking filters have been shown to improve the accuracy of gaze estimation and input in stationary settings. However, their effectiveness during physical movement remains underexplored. In this work, we compare common online filters in the context of physical locomotion in extended reality and propose alterations to improve them for on-the-go settings. We conducted a computational experiment in which we simulated the performance of the online filters using data from participants attending to visual targets located in world-, path-, and two head-based reference frames while standing, walking, and jogging. Our results provide insights into the filters' effectiveness and the factors that affect it, such as the amount of noise caused by locomotion and differences in compensatory eye movements, and demonstrate that filters with saccade detection prove most useful for on-the-go settings. We discuss the implications of our findings and conclude with guidance on gaze data filtering for interaction in extended reality.

    https://ieeexplore.ieee.org/document/10672561

    20 min
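    As a rough illustration of the kind of online gaze filter the paper evaluates, the sketch below smooths gaze samples with a moving average but resets the filter when a saccade-like velocity spike is detected, so fast gaze jumps are not smeared by the smoothing. The window size and velocity threshold are assumptions for the sketch, not values from the paper.

    ```python
    # Illustrative sketch of an online gaze filter with simple saccade detection:
    # smooth fixation samples, but reset the buffer when the gaze jumps quickly.
    from collections import deque
    import math

    WINDOW = 8                    # samples in the moving average (assumed)
    SACCADE_DEG_PER_S = 100.0     # velocity threshold for a saccade (assumed)

    class GazeFilter:
        def __init__(self):
            self._buf = deque(maxlen=WINDOW)
            self._last = None       # previous sample as (t, x_deg, y_deg)

        def update(self, t: float, x: float, y: float) -> tuple[float, float]:
            """Feed one gaze sample (seconds, degrees); return the filtered gaze point."""
            if self._last is not None:
                lt, lx, ly = self._last
                dt = max(t - lt, 1e-6)
                velocity = math.hypot(x - lx, y - ly) / dt
                if velocity > SACCADE_DEG_PER_S:
                    self._buf.clear()      # saccade: jump to the new target immediately
            self._last = (t, x, y)
            self._buf.append((x, y))
            n = len(self._buf)
            return (sum(p[0] for p in self._buf) / n, sum(p[1] for p in self._buf) / n)

    gf = GazeFilter()
    for i, (x, y) in enumerate([(0.0, 0.0), (0.1, 0.0), (15.0, 5.0)]):   # last sample is a saccade
        print(gf.update(i / 120.0, x, y))
    ```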