Donkey Car is a self-driving RC car project that may grow to include multiple RC cars, each backed by a different software stack to mimic distinct real-world scenarios like highway, parking lot, and off-road driving.
A standard Donkey build that runs the default inference model on the Pi. No extra juice added yet.
No rules! But if you want something to hold on to, the following builds are good starting points. All come with parts lists.
- Donkey, $200
- Berkeley BARC, min. $1000
- MIT RACECAR, $4500
- HyphaRos RACECAR, $600, low-cost version of the MIT RACECAR
- GT AutoRally, $30000
Donkey is designed to run around tracks bounded by painted lane markings.
- Implement lane-changing and obstacle-avoiding behaviors. (requires a dual-lane track, which I have no access to)
- Real-time lane following: steer the car according to the lane curvature extracted from a recorded video as it plays.
- Run ROS nodes split between a more powerful desktop and the Pi to enable a more complex inference model. (for unknown reasons, the Pi's built-in Wi-Fi doesn't have the bandwidth to support this feature)
- Mount more sensors and hack the Donkey framework to support custom sensors.
- Use those sensors to do visual SLAM. (RPLidar doesn't work on a painted lane)
- Implement an RNN to mitigate driving-signal latency. I don't feel the urge to implement it at the moment, though.
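The lane-curvature idea above can be sketched roughly as follows: fit a quadratic to already-extracted lane-center pixel coordinates and map the curvature of that fit to a normalized steering command. This is a minimal sketch under my own assumptions; the function name, the gain, and the upstream pixel-extraction step are all placeholders, not part of the Donkey framework.

```python
import numpy as np

def curvature_to_steering(xs, ys, y_eval, gain=1.0, max_steer=1.0):
    """Fit x = a*y^2 + b*y + c to lane-center pixels (image coordinates)
    and map the signed curvature at y_eval to a steering command in
    [-max_steer, max_steer]. gain is a tuning knob, not a derived value."""
    a, b, _ = np.polyfit(ys, xs, 2)
    dxdy = 2 * a * y_eval + b            # first derivative of the fit
    d2xdy2 = 2 * a                       # second derivative of the fit
    curvature = d2xdy2 / (1 + dxdy ** 2) ** 1.5
    return float(np.clip(gain * curvature, -max_steer, max_steer))
```

A perfectly straight lane yields zero curvature and hence zero steering; a lane bending right yields a positive command. In a real pipeline the gain would be tuned on the track and the curvature converted from pixels to meters first.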
- cycloid is a platform that uses wheel encoders and an IMU to do visual SLAM.
- Ghost is another project that employs the IMU + encoder combination to do SLAM.
This will likely be my second car, built to implement indoor navigation from point A to point B while avoiding 3D obstacles.
Installing encoders on RC cars can be tricky. An open-source ESC named VESC provides odometry data out of the box and makes a sensorless brushless motor responsive at low speed (good for training). This makes a second car inevitable. Besides, an RPLidar and a depth camera require a more powerful SBC than the Raspberry Pi.
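As a rough illustration of what that odometry gives you: the ESC reports electrical RPM, which can be converted to forward speed and fed into a kinematic bicycle model for dead reckoning. This is only a sketch; the vehicle constants below (pole pairs, gear ratio, wheel diameter, wheelbase) are made-up example values that would have to be measured on the actual car.

```python
import math

def update_pose(x, y, heading, erpm, steer_angle, dt,
                pole_pairs=7, gear_ratio=4.6, wheel_dia=0.1, wheelbase=0.26):
    """Advance a kinematic bicycle-model pose estimate one time step
    using the electrical RPM reported by the ESC.
    erpm / pole_pairs = motor RPM; divide by gear_ratio for wheel RPM."""
    wheel_rps = erpm / pole_pairs / gear_ratio / 60.0  # wheel revolutions/s
    v = wheel_rps * math.pi * wheel_dia                # forward speed, m/s
    x += v * math.cos(heading) * dt
    y += v * math.sin(heading) * dt
    heading += v / wheelbase * math.tan(steer_angle) * dt
    return x, y, heading
```

In practice this estimate drifts, which is exactly why the projects above fuse it with an IMU (and a camera or lidar) rather than trusting the encoders alone.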
Another benefit of this car is that it can be tested in a small, unstructured apartment, whereas Donkey requires a large enough track (a luxury I don't have) to test various behaviors.
- Use RPLidar or depth camera to build a map (SLAM)
- Use the map to navigate (Hybrid A* possibly)
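Hybrid A* itself is well beyond a few lines, but its discrete ancestor is easy to sketch: plain A* on an occupancy grid. Hybrid A* replaces the 4-connected grid moves below with car-like motion primitives and tracks continuous (x, y, heading) states; this sketch shows only the search core, with my own function names.

```python
import heapq

def astar(grid, start, goal):
    """Plain 4-connected A* on an occupancy grid (0 = free, 1 = occupied).
    Returns the optimal path as a list of (row, col) cells, or None."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    open_set = [(h(start), 0, start, None)]  # (f, g, cell, parent)
    came_from = {}                           # also serves as the closed set
    while open_set:
        _, g, cur, parent = heapq.heappop(open_set)
        if cur in came_from:
            continue                         # already expanded with a lower f
        came_from[cur] = parent
        if cur == goal:                      # reconstruct path back to start
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 \
                    and nxt not in came_from:
                heapq.heappush(open_set, (g + 1 + h(nxt), g + 1, nxt, cur))
    return None                              # goal unreachable
```

The appeal of the Hybrid A* extension is that every expanded state is reachable by an actual steering command, so the resulting path is drivable by a car that can't turn in place.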
This is the final product I intend to reproduce.