Skeletal Data Matching and Merging from Multiple RGB-D Sensors for Room-Scale Human Behaviour Tracking

Authors

Coppens A., Maquil V.

Reference

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 15158 LNCS, pp. 289-298, 2024

Description

Affordable room-scale human behaviour tracking may rely on commodity RGB-D sensors, which are sensitive to occlusions from objects or other people that might be in the way. To alleviate the occlusion issue and extend the tracking range while improving its accuracy, it is possible to rely on multiple RGB-D sensors and perform data fusion. Fusing the data in a meaningful manner, however, raises additional challenges related not only to calibrating the sensors relative to each other so as to provide a common frame of reference, but also to matching and merging skeletons when actually combining the data. In this paper, we discuss our approach to tackling these challenges and present the results we achieved, in the form of aligned point clouds and combined skeleton lists. These results successfully enable unobtrusive and occlusion-resilient human behaviour tracking at room scale, which may be used as input for interactive applications as well as (possibly remote) collaborative systems.
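To illustrate how skeleton fusion from multiple calibrated RGB-D sensors can be organised, the sketch below covers the three steps mentioned in the abstract: transforming each sensor's joints into a common frame of reference via a previously calibrated extrinsic matrix, matching skeletons across sensors by centroid proximity, and merging matched skeletons with a confidence-weighted average. This is a minimal, hypothetical sketch and not the paper's implementation; the skeleton dictionary layout, per-joint confidences and the 0.5 m matching threshold are assumptions made for illustration only.

```python
# Hypothetical sketch of multi-sensor skeleton fusion (not the authors' code).
import numpy as np


def to_common_frame(joints_xyz: np.ndarray, extrinsic: np.ndarray) -> np.ndarray:
    """Apply a 4x4 rigid transform (sensor frame -> room frame) to an (N, 3) joint array."""
    homogeneous = np.hstack([joints_xyz, np.ones((joints_xyz.shape[0], 1))])
    return (extrinsic @ homogeneous.T).T[:, :3]


def match_skeletons(skels_a, skels_b, max_dist=0.5):
    """Greedily pair skeletons from two sensors whose centroids lie within max_dist metres."""
    pairs, used_b = [], set()
    for i, sa in enumerate(skels_a):
        centroid_a = sa["joints"].mean(axis=0)
        best_j, best_d = None, max_dist
        for j, sb in enumerate(skels_b):
            if j in used_b:
                continue
            d = np.linalg.norm(centroid_a - sb["joints"].mean(axis=0))
            if d < best_d:
                best_j, best_d = j, d
        if best_j is not None:
            pairs.append((i, best_j))
            used_b.add(best_j)
    return pairs


def merge(sa, sb):
    """Confidence-weighted average of matched joints (per-joint confidence weights assumed)."""
    wa = sa["confidence"][:, None]
    wb = sb["confidence"][:, None]
    joints = (wa * sa["joints"] + wb * sb["joints"]) / (wa + wb)
    return {"joints": joints, "confidence": np.maximum(sa["confidence"], sb["confidence"])}


# Example: two sensors each report one 32-joint skeleton of the same person.
rng = np.random.default_rng(0)
skel_a = {"joints": rng.normal(size=(32, 3)), "confidence": np.full(32, 0.8)}
skel_b = {"joints": skel_a["joints"] + 0.02, "confidence": np.full(32, 0.6)}
# In practice each skeleton is first mapped into the room frame with its sensor's extrinsics.
skel_b["joints"] = to_common_frame(skel_b["joints"], np.eye(4))
for i, j in match_skeletons([skel_a], [skel_b]):
    fused = merge(skel_a, skel_b)
```

A greedy centroid match is only one possible association strategy; any cost-based assignment over joint-wise distances would slot into the same pipeline.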

Link

doi:10.1007/978-3-031-71315-6_30
