Project number
25503
Organization
UA Department of Systems and Industrial Engineering
Offering
ENGR498-S2025-F2025
Racing the Sun is a STEM event in which regional high school teams convert gasoline-powered go-karts into solar-powered electric go-karts and then compete in an annual race at the Musselman Honda racetrack. The Racing the Sun Autonomously project builds on this event to showcase autonomous driving capabilities and motivate students to pursue advanced engineering studies. The project also aims to enable Racing the Sun participants to develop their own autonomous platforms by releasing the software as an open-source package.
The team designed and manufactured a system that completed safe, repeatable autonomous laps at the Musselman Honda racetrack on a solar-electric kart, and published an open-source package for external use. The system uses a Raspberry Pi compute stack running Robot Operating System (ROS), a series of cameras as primary sensors, a servo motor for steering, and a touchscreen for setup and mode selection. During a run, the cameras find the track edges and the software plans a smooth path with appropriate speeds. The system has three operational modes: manual, remote-controlled, and autonomous (with an emergency stop). The interface supports data logging and easy parameter tuning. All components are packaged as ROS nodes with launch files and clear documentation, enabling teams to retrain or redevelop the vision model.
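As a minimal sketch of the edge-to-steering step described above, the logic can be reduced to: take the detected left and right track-edge positions in the camera frame, compute the lane-center offset, and issue a proportional steering command. The function name, image width, gain, and steering limit below are illustrative assumptions, not the team's actual parameters; the real system wraps this kind of logic in ROS nodes.

```python
# Simplified stand-in for the vision-to-steering pipeline.
# All constants are assumed values for illustration only.

IMAGE_WIDTH = 640   # assumed camera frame width in pixels
STEER_GAIN = 0.005  # assumed proportional gain (radians per pixel of offset)
MAX_STEER = 0.5     # assumed steering limit in radians

def steering_command(left_edge_x: float, right_edge_x: float) -> float:
    """Return a clamped steering angle that centers the kart between edges."""
    lane_center = (left_edge_x + right_edge_x) / 2.0
    offset = lane_center - IMAGE_WIDTH / 2.0  # positive => lane is to the right
    steer = STEER_GAIN * offset
    return max(-MAX_STEER, min(MAX_STEER, steer))

# Edges symmetric about the frame center -> drive straight
print(steering_command(220.0, 420.0))  # 0.0
```

In the actual package, a node subscribing to the camera topic would publish the resulting command to the steering servo; this sketch only captures the control arithmetic.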