The 21st century faces highly complex societal and intellectual challenges that can only be solved when professionals with distinct abilities and resources join efforts and collaborate. Interactive wall-sized displays offer substantial benefits for information visualisation and visual data analysis. Besides allowing large amounts of data to be presented more effectively, they support collocated collaboration, as multiple users can access and view content at the same time and easily follow each other’s actions.
However, in many situations (e.g. the COVID-19 pandemic, geographical barriers) face-to-face collaboration is not feasible and needs to be replaced by remote collaboration. The conventional tools used to support it strongly limit non-verbal awareness information, leading to communication difficulties and additional effort to stay engaged. This lack of awareness is particularly relevant for decision-making at interactive wall displays, because collaborators in front of a large screen naturally make use of a wide range of non-digital body movements and hand gestures.
To better mediate awareness information and facilitate communication, previous work suggests embedding visual cues into the common workspace or the live video stream. Such cues have been proposed for smaller workspaces such as tabletops, but have seldom been investigated in the context of remote collaboration across two or more wall displays.
The ReSurf project, funded by the FNR through its CORE programme, addresses the question of how person-oriented awareness cues should be designed to enhance remote collaboration across two physically distributed wall displays.
To do so, LIST researchers will combine design-based research, user-centred design, and user studies to explore how mutual awareness can be enhanced for collaborative decision-making in a distributed wall-display setup.
They will make use of different LIST wall displays (VisWall, 360° Immersive Arena, DemoWall) and conduct a user study to investigate how awareness information is shared in a well-functioning, collocated decision-making context. In an iterative approach, progressively integrating results from focus groups and user studies, they will design audio-visual awareness cues that draw on body movements (proxemics, postures, and hand gestures) and eye gaze to support remote collaboration. A series of user studies will shed light on the role and effectiveness of the different types of cues.
ReSurf will generate scientific knowledge on the optimal design of awareness support in remotely connected wall displays. Moreover, it will contribute to the next generation of remote decision-making tools, in which people can collaborate smoothly and enjoy an experience that is as close as possible to a collocated situation.
This innovative project will not only collect behavioural patterns for establishing and maintaining awareness in collocated settings, but also identify new types of audio-visual cues for mediating awareness information over distance and provide empirically validated results on the role and effectiveness of awareness cues in mixed-presence settings.
This project is supported by the Luxembourg National Research Fund under the CORE programme (grant no. C21/IS/15883550).
Coppens, A., & Maquil, V. (2024). Skeletal Data Matching and Merging from Multiple RGB-D Sensors for Room-Scale Human Behaviour Tracking. In: Luo, Y. (ed.) Cooperative Design, Visualization, and Engineering. CDVE 2024. Lecture Notes in Computer Science, vol. 15158. Springer, Cham. https://doi.org/10.1007/978-3-031-71315-6_30
Coppens, A., Schwartz, L., & Maquil, V. (2024). Workspace Awareness Needs in Mixed-Presence Collaboration on Wall-Sized Displays. In: Luo, Y. (ed.) Cooperative Design, Visualization, and Engineering. CDVE 2024. Lecture Notes in Computer Science, vol. 15158. Springer, Cham. https://doi.org/10.1007/978-3-031-71315-6_3
Coppens, A., Hermen, J., Schwartz, L., Moll, C., & Maquil, V. (2024). Supporting Mixed-Presence Awareness across Wall-Sized Displays Using a Tracking Pipeline Based on Depth Cameras. Proc. ACM Hum.-Comput. Interact. 8, EICS, Article 260 (June 2024), 32 pages. https://doi.org/10.1145/3664634
Maquil, V., Schwartz, L., & Coppens, A. (2024). Workspace Awareness Cues to Facilitate Mixed-Presence Collaborative Decision-Making on Wall-Sized Displays. ERCIM News: Extended Reality, 2024.
Anastasiou, D., Coppens, A., & Maquil, V. (2024). Gesture combinations during collaborative decision-making at wall displays. i-com, 23(1), 57–69. https://doi.org/10.1515/icom-2023-0037
Maquil, V., Anastasiou, D., Afkari, H., Coppens, A., Hermen, J., & Schwartz, L. (2023). Establishing Awareness through Pointing Gestures during Collaborative Decision-Making in a Wall-Display Environment. In Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems (CHI EA ’23), 7 pages. https://doi.org/10.1145/3544549.3585830
Vandenabeele, L., Afkari, H., Hermen, J., Deladiennée, L., Moll, C., & Maquil, V. (2022). DeBORAh: A Web-Based Cross-Device Orchestration Layer. In Proceedings of the 2022 International Conference on Advanced Visual Interfaces, Article 58, 1–3. https://doi.org/10.1145/3531073.3534483