The BRIDGES project was structured into seven work packages, all organized in an iterative and incremental manner to ensure the achievement of the project's goals.
Public deliverables, reports, publications, and other material produced during the project's lifetime are made available on this page.
Eliciting requirements for a multisensory eXtended Reality platform for training and informal learning (paper)
The paper – presented during the third day of CHIGreece 2021 – summarizes the research goals, methodology, and outcomes of the work carried out by the BRIDGES project to elicit user needs and requirements and to establish a user-centered iterative approach.
Exploring the Effect of Personality Traits in VR Interaction: The Emergent Role of Perspective-Taking in Task Performance
In this work, our partner NKUA (National and Kapodistrian University of Athens) and the Athena Research and Innovation Center explored the effect of personality traits on user interaction in virtual reality (VR), focusing on the less widely studied aspect of task performance during object manipulation. We conducted an experiment measuring the performance of 39 users interacting with a virtual environment using the virtual hand metaphor to execute a simple selection and positioning task, with or without virtual obstacles. Our findings suggest concrete correlations between users' personality traits and behavioral data.
Creating Informal Learning and First Responder Training XR Experiences with the ImmersiveDeck
In recent years, eXtended Reality (XR) technologies have matured and become affordable, yet creating XR experiences for training and learning is in many cases still a time-consuming and costly process, hindering widespread adoption. One factor driving this effort is that content and features commonly required by many applications are re-implemented for each experience instead of being shared and reused through a common platform. In this paper we present two XR experiences in the context of informal learning and first responder training, along with the shared platform they were created with and the creation process. Furthermore, we technically evaluated relevant parts of the platform against the experiences' requirements and confirmed their applicability. Finally, we present an informal expert evaluation of the user experience of the content creation process for the informal learning experience, along with guidelines derived from the findings.
Comparing Different Grasping Visualizations for Object Manipulation in VR using Controllers
Virtual grasping is one of the most common and important interactions performed in a Virtual Environment (VE). Even though there has been substantial research using hand tracking methods to explore different ways of visualizing grasping, only a few studies focus on handheld controllers. This gap in research is particularly crucial, since controllers remain the most used input modality in commercial Virtual Reality (VR). Extending existing research, we designed an experiment comparing three different grasping visualizations when users interact with virtual objects in immersive VR using controllers. We examine the following visualizations: the Auto-Pose (AP), where the hand is automatically adjusted to the object upon grasping; the Simple-Pose (SP), where the hand closes fully when selecting the object; and the Disappearing-Hand (DH), where the hand becomes invisible after selecting an object and turns visible again after positioning it on the target. We recruited 38 participants in order to measure if and how their performance, sense of embodiment, and preference are affected. Our results show that while there is almost no significant difference in performance between the visualizations, the perceived sense of embodiment is stronger with the AP, which is also generally preferred by users. Thus, this study incentivizes the inclusion of similar visualizations in relevant future research and VR experiences.
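The three visualization conditions described above can be summarized as simple state transitions of the virtual hand on grasp and release. The sketch below is purely illustrative and is not taken from the paper's implementation; all names (GraspVisualization, HandState, on_grasp, on_release) are hypothetical:

```python
from enum import Enum, auto
from dataclasses import dataclass

class GraspVisualization(Enum):
    AUTO_POSE = auto()          # AP: hand conforms to the grasped object
    SIMPLE_POSE = auto()        # SP: hand closes fully on selection
    DISAPPEARING_HAND = auto()  # DH: hand hidden while the object is held

@dataclass
class HandState:
    visible: bool
    pose: str  # "open", "closed", or "fitted" (illustrative labels)

def on_grasp(mode: GraspVisualization) -> HandState:
    """Hand state applied when an object is selected/grasped."""
    if mode is GraspVisualization.AUTO_POSE:
        return HandState(visible=True, pose="fitted")   # adjusted to the object
    if mode is GraspVisualization.SIMPLE_POSE:
        return HandState(visible=True, pose="closed")   # fully closed hand
    return HandState(visible=False, pose="open")        # DH: hand disappears

def on_release(mode: GraspVisualization) -> HandState:
    """Hand state after the object is positioned on the target."""
    # In all three conditions the hand is visible and open again after release.
    return HandState(visible=True, pose="open")
```

A real implementation would attach this logic to the controller's select/deselect events in the VR engine; the sketch only captures the visibility and pose differences between the three conditions.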