Data acquisition towards defining a multimodal interaction model for human-assistive robot communication

Fotinea, S.-E., Efthimiou, E., Dimou, A.-L., Goulas, T., Karioris, P., Peer, A., Maragos, P., Tzafestas, C., Kokkinos, I., Hauer, K., Mombaur, K., Koumpouros, I. and Stanzyk, B. (2014) Data acquisition towards defining a multimodal interaction model for human-assistive robot communication. In: Universal Access in Human-Computer Interaction. Aging and Assistive Environments, UAHCI/HCII.

We report on the procedures followed to acquire a multimodal sensory corpus that will serve as the primary source of data retrieval, data analysis and testing of the mobility-assistive robot prototypes developed in the European project MOBOT. Analysis of this corpus across all sensor modalities will lead to the definition of the multimodal interaction model; gesture and audio data analysis is foreseen to be integrated into the platform in order to support the communication channel between end users and the assistive robot prototypes expected as the project’s outcomes. To convey the full range of sensory data acquired, we describe the data acquisition scenarios followed to obtain the required multisensory data and the initial post-processing outcomes currently available.

Item Type: Conference or Workshop Item (Paper)
Uncontrolled Keywords: assistive robot, natural HRI, multimodal communication model, multisensory data acquisition
Faculty/Department: Faculty of Environment and Technology > Department of Engineering Design and Mathematics
ID Code: 31683
Deposited By: Professor A. Peer
Deposited On: 27 Apr 2017 11:21
Last Modified: 30 Apr 2017 04:13
