This video contains the first successful run our autonomous system JARVIC has ever made. It shows the system's first cautious steps; there is still a lot of tuning and validation to do.
The system's perception range is also heavily limited, owing to the nature of the perception pipeline and the LiDAR sensor used: a Velodyne Puck (VLP-16) Hi-Res. First, cone candidates are found by clustering the acquired point cloud. In a second step, those 3-D points are mapped to cutouts of the camera images, on which a classification neural network determines the color of the cone, if one is present. With this pipeline, we have to accept a lot of false positives in exchange for a reasonable perception range. But enhancements are on their way ...
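The two-stage idea (cluster the point cloud, then project candidates into the camera image for classification) can be sketched roughly as follows. This is a minimal illustration, not our actual code: the camera intrinsics, the clustering threshold, and the assumption that points are already in the camera frame are all hypothetical.

```python
import numpy as np

# Hypothetical pinhole intrinsics (placeholder values, not a real calibration)
K = np.array([[700.0,   0.0, 640.0],
              [  0.0, 700.0, 360.0],
              [  0.0,   0.0,   1.0]])

def cluster_points(points, eps=0.5):
    """Naive Euclidean clustering: greedily group points whose distance
    to an existing cluster centroid is below eps; return the centroids."""
    clusters = []
    for p in points:
        for c in clusters:
            if np.linalg.norm(np.mean(c, axis=0) - p) < eps:
                c.append(p)
                break
        else:
            clusters.append([p])
    return [np.mean(c, axis=0) for c in clusters]

def project_to_image(point_cam, image_size=(1280, 720)):
    """Project a 3-D point (already in camera coordinates) onto the
    image plane; return pixel coordinates or None if out of view."""
    if point_cam[2] <= 0:          # behind the camera
        return None
    uvw = K @ point_cam
    u, v = uvw[0] / uvw[2], uvw[1] / uvw[2]
    if 0 <= u < image_size[0] and 0 <= v < image_size[1]:
        return (u, v)
    return None

# Each surviving centroid would define a cutout around (u, v) that is
# handed to the classification network to decide the cone's color.
```

In practice the clustering and projection are done with proper LiDAR-to-camera extrinsics and a tuned clustering algorithm; the sketch only shows the data flow from point cloud to image cutout.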