Supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT)
Real-time Version of Lateral Distance Project:
https://github.com/ChicagoPark/Lateral_Realtime
Goal: Obtaining the accurate lateral distance between vehicles and adjacent lanes
The precise localization of nearby vehicles in advanced driver-assistance systems or autonomous driving systems is imperative for path planning or anomaly detection by monitoring and predicting the future behavior of the vehicles. In particular, the lateral position of a vehicle with respect to lane markers is essential information for such tasks.
We propose a novel method to accurately estimate the lateral distance of a front vehicle to lane markers by combining camera and LiDAR sensors and fusing deep neural networks in the 2D and 3D domains.
We utilize the RANdom SAmple Consensus (RANSAC) algorithm to separate the point cloud into road and obstacle points.
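The exact RANSAC implementation is not specified here; the sketch below shows one way to perform this road/obstacle split with Open3D's RANSAC plane segmentation, where the distance threshold and iteration count are illustrative assumptions rather than values from the paper:

```python
import open3d as o3d

def split_ground_obstacles(points_xyz, dist_thresh=0.2):
    """Separate a LiDAR scan into road (plane inliers) and obstacle
    points with a RANSAC plane fit; dist_thresh (meters) is an
    assumed tuning value, not taken from the paper."""
    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(points_xyz)
    # Fit the dominant plane (the road surface) with RANSAC.
    _, inlier_idx = pcd.segment_plane(
        distance_threshold=dist_thresh, ransac_n=3, num_iterations=1000)
    road = pcd.select_by_index(inlier_idx)
    obstacles = pcd.select_by_index(inlier_idx, invert=True)
    return road, obstacles
```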
We project the 3D point cloud onto the image and color each projected point according to its LiDAR intensity for visualization. We then utilize this projected output in the sensor dimension-upgrading process for lane matching.
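As a sketch of this projection step, assuming KITTI-style calibration with R0_rect and Tr_velo_to_cam already padded to 4x4 homogeneous matrices (the exact calibration handling is our assumption):

```python
import numpy as np
from matplotlib import cm

def project_lidar_to_image(pts_velo, P2, R0_rect, Tr_velo_to_cam):
    """Project N x 4 Velodyne points (x, y, z, intensity) onto the
    image plane and color them by intensity for visualization."""
    xyz1 = np.hstack([pts_velo[:, :3], np.ones((len(pts_velo), 1))])
    pts_cam = R0_rect @ Tr_velo_to_cam @ xyz1.T   # 4 x N, camera frame
    keep = pts_cam[2] > 0                         # drop points behind the camera
    uvw = P2 @ pts_cam[:, keep]                   # 3 x M, P2 is the 3 x 4 projection
    uv = (uvw[:2] / uvw[2]).T                     # M x 2 pixel coordinates
    # Map normalized LiDAR intensity to a color for display.
    colors = cm.jet(pts_velo[keep, 3] / max(pts_velo[:, 3].max(), 1e-6))
    return uv, colors
```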
We use the Ultra-Fast Lane Detection model with a ResNet backbone, which can detect lanes in harsh weather conditions and various environments. Additionally, to obtain stable and consistent lane detection results, we fit each lane's points with a quadratic curve and plot the lane as far as possible within the visible road area.
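The curve fitting can be sketched as below, fitting the image column u as a quadratic function of the row v (a common choice for near-vertical lanes; the extrapolation range is an assumption):

```python
import numpy as np

def fit_lane_curve(lane_pts, v_top, v_bottom, num=50):
    """Fit u = a*v**2 + b*v + c to detected lane points (u, v in
    pixels) and sample the curve over the visible road rows."""
    u, v = lane_pts[:, 0], lane_pts[:, 1]
    coeffs = np.polyfit(v, u, deg=2)              # quadratic lane model
    v_plot = np.linspace(v_top, v_bottom, num)    # extend over the road area
    u_plot = np.polyval(coeffs, v_plot)
    return coeffs, np.stack([u_plot, v_plot], axis=1)
```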
We use the PV-RCNN model, which considers the orientation of a vehicle and covers the vehicle with twelve lines to indicate its 3D bounding box. Through this 3D detection result, we can estimate the lateral distance from the exact mid-low point of each vehicle to the lanes.
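Assuming the detector outputs boxes in OpenPCDet's (x, y, z, dx, dy, dz, heading) convention with (x, y, z) at the box center (an assumption on our part), the mid-low reference point is simply the bottom center of the box:

```python
import numpy as np

def mid_low_point(box):
    """Bottom-center of a 3D bounding box given as
    (x, y, z, dx, dy, dz, heading) with a centered origin."""
    x, y, z, dx, dy, dz, heading = box
    return np.array([x, y, z - dz / 2.0])
```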
To obtain matching points on the lanes, we first derive the quadratic equations for the lane markers in the image domain and collect the corresponding pixels occupied by the curves. Next, we search for all projected points that lie on those pixels and keep them in dynamic memory. Through this process, we can reliably upgrade the lane detection from the 2D to the 3D domain.
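A minimal sketch of this lookup, assuming integer lane-curve pixels and the projected pixel/3D point arrays produced by the projection step above:

```python
import numpy as np

def lift_lane_to_3d(lane_pixels, proj_uv, pts_3d):
    """Collect the 3D LiDAR points whose image projection lands on
    pixels occupied by the fitted lane curve."""
    lookup = {}
    # Index every projected LiDAR point by the pixel it occupies.
    for (u, v), p in zip(np.rint(proj_uv).astype(int), pts_3d):
        lookup.setdefault((u, v), []).append(p)
    # Gather the 3D points sitting on the lane-curve pixels.
    matches = [p for uv in map(tuple, lane_pixels)
               for p in lookup.get(uv, [])]
    return np.asarray(matches)
```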
We obtain the lateral distance by calculating the minimum distance from the mid-low point of the vehicle bounding box to the lane curve. Last but not least, we display the lateral distance in meters using text markers, and we indicate the lateral locations of vehicles using pink line markers.
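A sketch of that distance computation, assuming the lifted 3D lane points have been refit with a quadratic on the ground plane (the sampling density and the 2D ground-plane frame are assumptions):

```python
import numpy as np

def lateral_distance(vehicle_xy, lane_coeffs, y_range, num=500):
    """Minimum distance from the vehicle's mid-low point to a lane
    modeled as x = a*y**2 + b*y + c on the ground plane."""
    ys = np.linspace(y_range[0], y_range[1], num)
    xs = np.polyval(lane_coeffs, ys)
    curve = np.stack([xs, ys], axis=1)              # sampled lane points
    dists = np.linalg.norm(curve - np.asarray(vehicle_xy), axis=1)
    return dists.min()
```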
Our framework provides the location of other vehicles with respect to the lanes of the ego-vehicle, whereas previous work usually offers the vehicle location with respect to the ego-vehicle itself. We strongly believe that our work can provide more helpful information for detecting anomalous behavior or predicting the trajectories of surrounding vehicles. Furthermore, the lateral distance values can be shared with networked vehicles around an ego-vehicle to strengthen traffic safety. Thus, our work can be a valuable addition to vehicle communication and network research.
As there is no dataset of vehicles' lateral distances measured from the lanes of an ego-vehicle, we created a dataset ourselves. We first extracted 1,281 frames suitable for this study from the KITTI dataset and then searched for the closest points to each vehicle on the left and right lanes of the ego-vehicle.
The full paper is expected to be published in June 2022.