An Unsupervised Transfer Learning Framework for Visible-Thermal Pedestrian Detection

Sensors (Basel). 2022 Jun 10;22(12):4416. doi: 10.3390/s22124416.

Abstract

Dual cameras that capture visible-thermal multispectral pairs provide both visual and thermal appearance cues, enabling pedestrian detection around the clock in various conditions and applications, including autonomous driving and intelligent transportation systems. However, because real-world scenarios vary greatly, the performance of a detector trained on one dataset can degrade dramatically when it is evaluated on another. Large amounts of training data are often needed to guarantee detection performance in a new scenario, and labeling them typically requires human annotators, which is time-consuming, labor-intensive, and does not scale. To overcome this problem, we propose a novel unsupervised transfer learning framework for multispectral pedestrian detection, which adapts a multispectral pedestrian detector to the target domain based on pseudo training labels. In particular, auxiliary detectors are utilized and different label fusion strategies are applied according to the estimated environmental illumination level. Intermediate domain images are generated by translating the source images to mimic the target ones, providing a better starting point for updating the detector's parameters. Experimental results on the KAIST and FLIR ADAS datasets demonstrate that the proposed method achieves new state-of-the-art performance without any manual training annotations on the target data.
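The illumination-aware label fusion described above can be sketched in plain Python. This is a minimal illustration, not the paper's actual method: the function names, the mean-luminance illumination estimate, the 0.7/0.3 confidence weights, and the greedy NMS merge are all assumptions chosen for the example.

```python
def estimate_illumination(pixels):
    """Crude day/night proxy: mean grayscale value scaled to [0, 1].
    `pixels` is a flat iterable of 0-255 intensities (illustrative heuristic)."""
    vals = list(pixels)
    return sum(vals) / (255.0 * len(vals))


def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0


def fuse_pseudo_labels(vis_boxes, thr_boxes, illumination,
                       day_thresh=0.35, iou_thresh=0.5):
    """Merge detections (x1, y1, x2, y2, score) from the two auxiliary
    detectors into pseudo labels. In bright scenes the visible-band detector
    is weighted more heavily; in dark scenes, the thermal one.
    The specific weights and thresholds are placeholder values."""
    w_vis = 0.7 if illumination >= day_thresh else 0.3
    cands = [(list(b), s * w_vis) for *b, s in vis_boxes]
    cands += [(list(b), s * (1.0 - w_vis)) for *b, s in thr_boxes]
    # Greedy non-maximum suppression: keep the highest-scoring box,
    # discard overlapping lower-scoring duplicates of the same pedestrian.
    cands.sort(key=lambda t: t[1], reverse=True)
    kept = []
    for box, score in cands:
        if all(iou(box, k) < iou_thresh for k, _ in kept):
            kept.append((box, score))
    return kept
```

In a daytime frame, two overlapping detections of the same pedestrian from the visible and thermal detectors collapse into a single pseudo label carrying the visible detector's (up-weighted) score; at night the weighting flips toward the thermal detector.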

Keywords: deep learning; domain adaptation; multispectral fusion; pedestrian detection; unsupervised transfer learning.

MeSH terms

  • Algorithms
  • Automobile Driving*
  • Humans
  • Lighting
  • Machine Learning
  • Pedestrians*

Grants and funding

The research leading to these results has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 765866—ACHIEVE and the ECSEL joint undertaking grant agreement No. 876487—NextPerception.