Machine learning is not a buzzword for our team. It is a practical engineering layer integrated into the full robot workflow. We use it to detect game elements, pick up artifacts autonomously, and automate as many repeatable actions as possible so drivers can focus on match strategy.
Software Features on This Robot
Sensor Fusion + Vision Reliability
Odometry, IMU, and camera tracking are fused into a single stable pose estimate, with periodic camera-based pose refreshes to correct accumulated drift during autonomous cycles.
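The fusion step can be sketched as a simple complementary filter: dead-reckoned odometry runs every loop, and whenever the camera produces a pose fix, the estimate is nudged toward it. This is a minimal illustration, not the team's actual filter; the `alpha` blend weight, the `(x, y, heading)` tuple layout, and the function name are all assumptions made for the sketch.

```python
import math

def fuse_pose(odom_pose, vision_pose, alpha=0.2):
    """Blend a dead-reckoned odometry pose with a camera pose fix.

    alpha is the trust placed in the vision fix (0 = ignore camera,
    1 = snap fully to the camera). Poses are (x, y, heading_rad) tuples.
    """
    if vision_pose is None:  # no target in view: keep pure odometry
        return odom_pose
    ox, oy, oh = odom_pose
    vx, vy, vh = vision_pose
    # Blend headings on the circle so wraparound (+pi vs -pi) is handled.
    dh = math.atan2(math.sin(vh - oh), math.cos(vh - oh))
    return (
        ox + alpha * (vx - ox),
        oy + alpha * (vy - oy),
        oh + alpha * dh,
    )
```

Because the blend is applied only when a fix arrives, odometry stays responsive between camera frames while long-term drift is bounded by the vision updates.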
Automated Artifact Cycle
Inverse-kinematics auto-aim, autonomous artifact pickup, burst-shot sequencing, and geofencing checks keep cycles fast, repeatable, and safe within valid scoring zones.
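Two of those pieces reduce to a few lines of geometry: the auto-aim solves for the turret angle that points the shooter at the target, and the geofence gates the burst so it only fires from a legal zone. The sketch below is a hedged illustration; the rectangular `SCORING_ZONE` coordinates, units, and function names are invented for the example, not taken from the robot's code.

```python
import math

# Hypothetical field geometry: an axis-aligned rectangle marking the
# zone from which shots may legally be scored (units: meters).
SCORING_ZONE = (0.0, 0.0, 1.8, 1.8)  # (x_min, y_min, x_max, y_max)

def inside_scoring_zone(x, y, zone=SCORING_ZONE):
    """Geofencing check: allow a burst shot only from a valid zone."""
    x_min, y_min, x_max, y_max = zone
    return x_min <= x <= x_max and y_min <= y <= y_max

def aim_turret(robot_x, robot_y, robot_heading, target_x, target_y):
    """Return the turret yaw (radians, relative to the chassis) that
    points the shooter at the target position on the field."""
    field_bearing = math.atan2(target_y - robot_y, target_x - robot_x)
    yaw = field_bearing - robot_heading
    # Normalize to [-pi, pi] so the turret takes the short way around.
    return math.atan2(math.sin(yaw), math.cos(yaw))
```

In a real cycle these would run every loop from the fused pose estimate, with the burst sequencer checking `inside_scoring_zone` immediately before each shot.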
Driver-First, Fail-Operational Control
Manual redundancy, safety interlocks and fast override paths keep the robot controllable under pressure even if a sensor pipeline degrades.
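The arbitration logic behind that guarantee can be sketched as a small priority ladder: the driver's override always wins, autonomous assistance is used only while the vision pipeline is fresh, and anything stale degrades back to manual. The staleness threshold, argument names, and string commands below are assumptions for illustration, not the robot's actual control API.

```python
import time

VISION_TIMEOUT_S = 0.25  # hypothetical staleness threshold for vision data

def select_drive_command(manual_cmd, auto_cmd, last_vision_ts,
                         override_pressed, now=None):
    """Fail-operational arbitration: manual override always wins, and
    autonomous assistance is dropped when the vision pipeline goes stale."""
    now = time.monotonic() if now is None else now
    if override_pressed:
        return manual_cmd      # fast driver override path
    if auto_cmd is not None and (now - last_vision_ts) <= VISION_TIMEOUT_S:
        return auto_cmd        # assistance only while vision is fresh
    return manual_cmd          # degrade gracefully to manual control
```

Keeping the fallback a pure function of observable state (timestamps and button inputs) makes the degraded mode easy to test on the bench without faulting real hardware.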
For the national stage, we switched to a Limelight camera (Raspberry Pi based) to benchmark it against our previous camera stack.
Performance Snapshot