Pedestrian Detection for Autonomous Cars: Inference Fusion of Deep Neural Networks

Research output: Contribution to journal › Article › peer-review

Abstract

Network fusion has recently been explored as an approach to improving pedestrian detection performance. However, most existing fusion methods suffer from poor runtime efficiency, modularity, scalability, and maintainability due to the complex structure of the fully fused models, their end-to-end training requirements, and their sequential fusion process. Addressing these challenges, this paper proposes a novel fusion framework that combines asymmetric inferences from object detectors and semantic segmentation networks to jointly detect multiple pedestrians. This is achieved by introducing a consensus-based scoring method that fuses pair-wise pixel-relevant information from the object detector and the semantic segmentation network to boost the final confidence scores. Because the object detection and semantic segmentation networks run in parallel in the proposed framework, the fusion step incurs low runtime overhead. The efficiency and robustness of the proposed fusion framework are extensively evaluated by fusing different state-of-the-art pedestrian detectors and semantic segmentation networks on a public dataset. The generalization of the fused models is also examined on new cross-domain pedestrian data collected with an autonomous car. Results show that the proposed fusion method significantly improves detection performance while maintaining competitive runtime efficiency.
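The abstract does not spell out the scoring formula itself. Purely as an illustration, the Python sketch below shows one plausible form of such a consensus score: the detector's box confidence blended with the fraction of pedestrian-class pixels the segmentation mask assigns inside that box. The function name and the `person_id` and `alpha` parameters are assumptions made for this sketch, not the paper's actual method.

```python
import numpy as np

# Hypothetical consensus-based re-scoring: boost a detector's box
# confidence by the fraction of pedestrian-labelled pixels inside it.
def consensus_score(box, det_score, seg_mask, person_id=1, alpha=0.5):
    """box: (x1, y1, x2, y2) from the detector; det_score: its confidence;
    seg_mask: HxW array of per-pixel class IDs from the segmentation network;
    person_id, alpha: assumed pedestrian class ID and mixing weight."""
    x1, y1, x2, y2 = (int(round(v)) for v in box)
    region = seg_mask[y1:y2, x1:x2]
    if region.size == 0:          # degenerate box: keep the original score
        return det_score
    agreement = float((region == person_id).mean())  # pixel-level consensus
    # Linear blend of the two asymmetric inferences; the networks themselves
    # run in parallel upstream, so this step adds negligible overhead.
    return (1.0 - alpha) * det_score + alpha * agreement

# Example: a box whose pixels the segmentation network mostly agrees on.
mask = np.zeros((480, 640), dtype=np.int32)
mask[100:300, 200:300] = 1        # pedestrian-labelled region
print(consensus_score((200, 100, 300, 300), 0.6, mask))  # boosted to 0.8
```

Any monotone combination of the two signals would fit the description equally well; the linear blend is chosen here only because it is the simplest to read.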
Original language: English
Pages (from-to): 23358-23368
Number of pages: 11
Journal: IEEE Transactions on Intelligent Transportation Systems
Volume: 23
Issue number: 12
DOIs
State: Published - Dec 1 2022

Keywords

  • autonomous vehicles
  • deep learning
  • fusion
  • object detection
  • pedestrian detection
  • semantic segmentation
