2D Object Detection and Tracking based on Stereo Vision Camera and Lidar Data

2020, Oct 07    

Environmental perception is a fundamental function of autonomous systems such as robots or self-driving cars. The process requires a set of sensing devices from which the vehicle obtains crucial information about the surrounding environment, including the detection and tracking of nearby objects. But every sensor has its own limitations, so it would be naïve to fully trust data from any individual sensor. One proposed solution is to fuse data from multiple sensors, a process known as sensor fusion, so that the extracted information carries less uncertainty.

Along with my friend Arlin Nur Ramadhani, I worked on this project as a requirement to obtain a bachelor's degree in Engineering Physics. We were supervised by Augie Widyotriatmo, S.T., M.T., Ph.D., and Prof. Dr.-Ing. Ir. Yul Yunazwin Nazaruddin, M.Sc., DIC. This project is a small part of a larger effort called the "Autonomous Vehicle Project of Bandung Institute of Technology".

Architecture

System Architecture

The proposed design is a detection and tracking system that uses two kinds of sensors to perceive the environment: an Intel RealSense D435 stereo vision camera and a 2D RPLiDAR laser scanner. Several pre-processing steps are needed before the object detection algorithm can use the data while discarding what is not relevant. The 3D depth output from the stereo vision camera is converted into a 2D depth slice so that the data resemble a LiDAR scan. Meanwhile, the LiDAR data are pre-processed so that only points within the camera's field of view (79°) are passed on to the object detection algorithm. The resulting detections are then fused using multivariate Gaussian principles, which produces a new distribution: the fused detection. This detection is fed to a particle filter-based object tracking algorithm, which further estimates the dynamic states of the detected objects, such as velocity and acceleration. We use ROS as the main framework for the whole pipeline.
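The two pre-processing steps can be sketched roughly as follows. This is a minimal NumPy sketch under my own assumptions, not the actual project code: the band width around the reference row and the function names are illustrative choices.

```python
import numpy as np

CAMERA_FOV_DEG = 79.0  # horizontal field of view of the RealSense D435


def depth_image_to_scan(depth, row=None, band=5):
    """Collapse a depth image (H x W, metres) into a 2D 'scan' by taking
    the minimum valid depth in each column around a reference row."""
    h, w = depth.shape
    row = h // 2 if row is None else row
    strip = depth[max(row - band, 0):row + band, :]
    strip = np.where(strip > 0, strip, np.inf)  # ignore invalid zero returns
    ranges = strip.min(axis=0)                  # one range value per column
    angles = np.deg2rad(np.linspace(-CAMERA_FOV_DEG / 2, CAMERA_FOV_DEG / 2, w))
    return angles, ranges


def filter_lidar_to_fov(angles, ranges, fov_deg=CAMERA_FOV_DEG):
    """Keep only the LiDAR returns that fall inside the camera field of view."""
    half = np.deg2rad(fov_deg) / 2
    mask = np.abs(angles) <= half
    return angles[mask], ranges[mask]
```

After both steps, the camera and LiDAR data live in the same angle/range representation, which is what makes the later per-object fusion straightforward.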

Object Detection

Object detection example

The detection working principle is largely based on the obstacle_detector package created by Mateusz Przybyla, which uses a density-based clustering method to group point clouds and create geometric representations of the objects within the sensor's vicinity.
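To illustrate the idea, a minimal Euclidean-gap clustering over scan points might look like this. It is a simplified stand-in for the package's actual method; the thresholds and function names are my own assumptions.

```python
import numpy as np


def cluster_scan(angles, ranges, max_gap=0.2, min_points=3):
    """Group consecutive scan points whose Euclidean gap stays below
    max_gap (metres); a crude density-based grouping of a 2D scan."""
    pts = np.column_stack([ranges * np.cos(angles), ranges * np.sin(angles)])
    clusters, current = [], [pts[0]]
    for p, q in zip(pts[:-1], pts[1:]):
        if np.linalg.norm(q - p) <= max_gap:
            current.append(q)          # same object: small gap to neighbour
        else:
            if len(current) >= min_points:
                clusters.append(np.array(current))
            current = [q]              # large gap: start a new cluster
    if len(current) >= min_points:
        clusters.append(np.array(current))
    return clusters


def circle_fit(cluster):
    """Crude circular representation of a cluster: centroid plus max radius."""
    c = cluster.mean(axis=0)
    r = np.linalg.norm(cluster - c, axis=1).max()
    return c, r
```

Each cluster is then summarised by a simple geometric primitive (here a circle), which is the representation the tracker consumes.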

Tracking

Tracking is defined here as the process of estimating the dynamic states of the detected objects. It consists of several sub-processes:

  1. Data association, to ensure each object is tracked consistently.
    We use simple linear sum assignment (the Hungarian algorithm) to solve the correspondence problem between multiple detections.
  2. The data/sensor fusion process.
    Using the multivariate Gaussian multiplication principle, the detection results from each sensor are treated as Gaussian distributions, with the measured value as the mean and an uncertainty obtained through experiment.
  3. State estimation using a particle filter.
    Since the camera and LiDAR can only make partial observations (i.e., only measuring the position and radius of an object), and random perturbations are present both in the sensors and in the objects, the proposed system uses a particle filter that estimates the internal states of the objects by generating random samples approximating the posterior distributions of the states given the set of detections. A particle filter operates under weak assumptions, so using Gaussian distributions during the estimation process poses no problem.
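The three sub-processes above can be sketched as follows, again as a minimal NumPy illustration under my own assumptions: the brute-force matching stands in for the Hungarian algorithm (and assumes equally many tracks and detections), and the particle filter is represented only by its systematic resampling step.

```python
from itertools import permutations
import numpy as np


def associate(tracks, detections):
    """1) Data association: optimal one-to-one matching by brute force,
    a stand-in for the Hungarian algorithm (fine for a handful of objects;
    assumes len(tracks) == len(detections))."""
    n = len(tracks)
    cost = np.linalg.norm(tracks[:, None, :] - detections[None, :, :], axis=2)
    best = min(permutations(range(n)),
               key=lambda p: cost[np.arange(n), np.array(p)].sum())
    return list(enumerate(best))


def fuse_gaussians(mu1, cov1, mu2, cov2):
    """2) Sensor fusion: product of two multivariate Gaussians, e.g. the
    camera and LiDAR detections of the same object."""
    inv1, inv2 = np.linalg.inv(cov1), np.linalg.inv(cov2)
    cov = np.linalg.inv(inv1 + inv2)
    mu = cov @ (inv1 @ mu1 + inv2 @ mu2)
    return mu, cov


def resample(particles, weights):
    """3) A building block of the particle filter: systematic resampling,
    which concentrates particles in high-probability regions."""
    n = len(particles)
    positions = (np.arange(n) + np.random.uniform()) / n
    cum = np.cumsum(weights)
    cum[-1] = 1.0  # guard against floating-point round-off
    return particles[np.searchsorted(cum, positions)]
```

Note how the fused covariance is always "tighter" than either input covariance: fusing two unit-variance detections yields a variance of 0.5, which is exactly the reduced uncertainty the fusion step is after.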

Result

Tracking a jogging pedestrian

Tracking a motorbike and a jogging pedestrian in a more complex environment

Github

Visit the repository here to download the source code.

References

A. Ess, K. Schindler, B. Leibe and L. Van Gool, "Object Detection and Tracking for Autonomous Navigation in Dynamic Environments," The International Journal of Robotics Research, vol. 29, no. 14, pp. 1707-1725, 2010.

M. Przybyla, “Detection and Tracking of 2D Geometric Obstacles from LRF data,” in 11th International Workshop on Robot Motion and Control (RoMoCo), 2017.

P. Kozierski, M. Lis and J. Ziętkiewicz, "Resampling in Particle Filtering - Comparison," Studia z Automatyki i Informatyki, pp. 40-58, 2013.