Sensors (Basel). 2017 Nov 9;17(11). pii: E2585. doi: 10.3390/s17112585.

Textile Pressure Mapping Sensor for Emotional Touch Detection in Human-Robot Interaction.

Author information

1. German Research Center for Artificial Intelligence, 67663 Kaiserslautern, Germany. bo.zhou@dfki.de.
2. Department of Computer Science, University of Kaiserslautern, 67663 Kaiserslautern, Germany. carlos_andres.velez_altamirano@dfki.uni-kl.de.
3. Department of Computer Science, University of Kaiserslautern, 67663 Kaiserslautern, Germany. heber.cruz_zurian@dfki.de.
4. Swedish School of Textiles, University of Borås, 50190 Borås, Sweden. seyedreza.atefi@hb.se.
5. School of Informatics, University of Skövde, 54128 Skövde, Sweden. erik.billing@his.se.
6. Swedish School of Textiles, University of Borås, 50190 Borås, Sweden. fernando.seoane_martinez@hb.se.
7. Institute for Clinical Science, Intervention and Technology, Karolinska Institutet, 17177 Stockholm, Sweden. fernando.seoane_martinez@hb.se.
8. Department of Biomedical Engineering, Karolinska University Hospital, 14186 Stockholm, Sweden. fernando.seoane_martinez@hb.se.
9. German Research Center for Artificial Intelligence, 67663 Kaiserslautern, Germany. paul.lukowicz@dfki.de.
10. Department of Computer Science, University of Kaiserslautern, 67663 Kaiserslautern, Germany. paul.lukowicz@dfki.de.

Abstract

In this paper, we present a fully textile pressure-mapping fabric that serves as a robot skin for detecting tactile human-robot interactions. The sensor covers a 20 × 20 cm² area with 400 sensitive points, each sampled at 50 Hz. We defined seven gestures inspired by the social and emotional interactions of typical person-to-person or person-to-pet scenarios. We conducted two groups of mutually blinded experiments involving 29 participants in total. The data processing algorithm first reduces the spatial complexity of each frame to a set of frame descriptors; temporal features are then calculated from basic statistical representations and wavelet analysis. Various classifiers are evaluated, and the feature calculation algorithms are analyzed in detail to determine the contribution of each stage and segment. The best-performing feature-classifier combination recognizes the gestures with 93.3% accuracy for a known group of participants and 89.1% for strangers.
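The two-stage pipeline described in the abstract (spatial reduction of each pressure frame to descriptors, then temporal statistics plus wavelet analysis over the descriptor sequences) could be sketched as follows. This is an illustrative assumption only: the descriptor choices (total pressure, center of pressure), the one-level Haar decomposition, and all function names are hypothetical and not taken from the paper's actual algorithm.

```python
import statistics

def frame_descriptors(frame):
    """Reduce one 20x20 pressure frame (list of rows) to scalar
    descriptors: total pressure and center-of-pressure coordinates.
    (Hypothetical descriptors, chosen for illustration.)"""
    total = sum(sum(row) for row in frame)
    if total == 0:
        return (0.0, 0.0, 0.0)
    cx = sum(x * v for row in frame for x, v in enumerate(row)) / total
    cy = sum(y * v for y, row in enumerate(frame) for v in row) / total
    return (total, cx, cy)

def haar_level1(signal):
    """One-level Haar wavelet decomposition of a 1-D signal,
    returning (approximation, detail) coefficient lists."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def temporal_features(series):
    """Summarize one descriptor time series with basic statistics
    and wavelet sub-band energies."""
    approx, detail = haar_level1(series)
    return {
        "mean": statistics.mean(series),
        "std": statistics.pstdev(series),
        "min": min(series),
        "max": max(series),
        "approx_energy": sum(a * a for a in approx),
        "detail_energy": sum(d * d for d in detail),
    }
```

In such a scheme, each gesture recording (a sequence of frames sampled at 50 Hz) would be mapped to a fixed-length feature vector by concatenating `temporal_features` over each descriptor channel, which could then be fed to a standard classifier.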

KEYWORDS:

human-robot interaction; smart textiles; tactile sensing

PMID: 29120389
PMCID: PMC5713507
DOI: 10.3390/s17112585
[Indexed for MEDLINE]
Free PMC Article
