DBUS: Human Driving Behavior Understanding System

Abstract

Human driving behavior understanding is a key ingredient for intelligent transportation systems. Whether developing self-driving cars that drive like humans or building V2X systems to improve the human driving experience, we need to understand how humans drive and interact with their environments. Massive human driving data collected by top ride-sharing platforms and fleet management companies offers the potential for in-depth understanding of human driving behavior. In this paper, we present DBUS, a real-time driving behavior understanding system that works with front-view videos and GPS/IMU signals collected from daily driving scenarios. Unlike previous work on driving behavior analysis, DBUS focuses not only on the recognition of basic driving actions but also on the identification of the driver’s intentions and attention. The analysis procedure is designed by mimicking human intelligence for driving, powered by the representation capability of deep neural networks as well as recent advances in visual perception, video temporal segmentation, attention mechanisms, etc. Beyond systematic driving behavior analysis, DBUS also supports efficient behavior-based driving scenario search and retrieval, which is essential for practical applications working with large-scale human driving scenario datasets. We perform extensive evaluations of DBUS in terms of the inference accuracy of intentions, the interpretability of inferred driver attention, and system efficiency. We also provide insightful intuitions as to why and how certain components work, based on experience gained in the development of the system.

Publication
2019 IEEE/CVF International Conference on Computer Vision Workshop (ICCVW)
Zhengping Che
X-Humanoid
Yan Liu
Professor, Computer Science Department