Competition Mode
Fusion mode combines IMU stability with vision correction. The key rule is simple: the IMU runs continuously; camera corrections are applied only in trusted windows.
Step 1
Define Source Roles
IMU: continuous heading baseline (fast, stable).
Webcam/AprilTag: absolute correction when tags are valid.
Localizer: keeps translation estimate between corrections.
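The split of roles above can be sketched as a minimal fused-heading model: the IMU supplies a fast relative heading every loop, and a tag correction only replaces the absolute offset. The class and method names here are illustrative, not from the FTC SDK.

```java
public class HeadingEstimator {
    private double imuHeadingDeg;   // continuous baseline from the IMU
    private double offsetDeg;       // absolute offset, set only by tag corrections

    // Called every loop: the IMU integrates on its own; we just cache it.
    public void updateFromImu(double rawImuHeadingDeg) {
        imuHeadingDeg = rawImuHeadingDeg;
    }

    // Called only in a trusted window: snap the offset so the fused
    // heading matches the tag's absolute bearing.
    public void correctFromTag(double absoluteBearingDeg) {
        offsetDeg = absoluteBearingDeg - imuHeadingDeg;
    }

    // Fused heading = fast relative signal + slowly updated absolute offset.
    public double fusedHeadingDeg() {
        return imuHeadingDeg + offsetDeg;
    }
}
```

Between corrections the offset is frozen, so the estimate drifts only as fast as the IMU does; each valid tag snaps the offset back to absolute truth.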
Step 2
Implement a Safe Correction Gate
Apply the correction only when the robot is in a stable state:
// Correct only in a trusted window: shooting is allowed, the driver is
// not actively steering, and the robot is physically stationary.
if (allowedToShoot && !manualControl && isRobotStationary) {
    aprilTagIdentification.getRobotPose();  // refresh the latest tag solve
    if (aprilTagIdentification.locTagFound) {
        calculatedRobotPose_X = aprilTagIdentification.robotPose_x;
        calculatedRobotPose_Y = aprilTagIdentification.robotPose_y;
        robotAngleAprilTag = aprilTagIdentification.bearingAngle;

        // Re-zero the IMU, then apply the offset so the fused heading
        // matches the tag's absolute bearing. 36.5 and 126.5 are
        // alliance/field-specific constants measured for this setup.
        gyroscope.resetHeading();
        gyroscope.setAngleOffset(36.5 - robotAngleAprilTag);
        drive.setPose(new Pose(calculatedRobotPose_X, calculatedRobotPose_Y,
                Math.toRadians(126.5 - robotAngleAprilTag)));
    }
}
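The `isRobotStationary` flag in the gate deserves its own check. One way to sketch it: require both angular velocity and drive speed to stay under small thresholds for a debounce window, so a momentary pause mid-turn does not open the correction window. The thresholds and names here are assumptions to tune on your robot.

```java
public class StationarityGate {
    private static final double MAX_ANGULAR_DEG_PER_SEC = 1.0;
    private static final double MAX_DRIVE_SPEED = 0.02;  // normalized wheel speed
    private static final long DEBOUNCE_MS = 250;         // must be still this long

    private long stillSinceMs = -1;  // -1 means "currently moving"

    public boolean isStationary(double angularVelDegPerSec, double driveSpeed, long nowMs) {
        boolean still = Math.abs(angularVelDegPerSec) < MAX_ANGULAR_DEG_PER_SEC
                && Math.abs(driveSpeed) < MAX_DRIVE_SPEED;
        if (!still) {
            stillSinceMs = -1;   // any motion resets the debounce timer
            return false;
        }
        if (stillSinceMs < 0) stillSinceMs = nowMs;
        return nowMs - stillSinceMs >= DEBOUNCE_MS;
    }
}
```

The debounce matters because IMU angular velocity is noisy: a single quiet sample is not proof the robot has settled.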
Step 3
Handle Alliance Variants Explicitly
Use separate angle offsets for the red and blue alliances.
Do not merge alliance math into one undocumented expression.
Validate each alliance independently on field.
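One way to keep the alliance math explicit is an enum with named, documented offsets instead of inline constants. The RED values below match the constants in the Step 2 snippet; the BLUE values are placeholders that must be measured on the field, not mirrored blindly.

```java
public enum Alliance {
    RED(36.5, 126.5),
    // Placeholder values: measure these on the field for your setup.
    BLUE(-36.5, 53.5);

    public final double gyroOffsetDeg;   // applied to the IMU after resetHeading()
    public final double poseHeadingDeg;  // base heading used when setting the pose

    Alliance(double gyroOffsetDeg, double poseHeadingDeg) {
        this.gyroOffsetDeg = gyroOffsetDeg;
        this.poseHeadingDeg = poseHeadingDeg;
    }
}
```

With this in place, the correction code reads `alliance.gyroOffsetDeg - robotAngleAprilTag`, and a wrong-alliance bug becomes a one-line fix in one obvious place.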
Step 4
Add Fallback Behavior
If tag confidence is low, continue with IMU-only aiming.
Do not force correction when detections flicker.
Expose a one-button fallback toggle to drivers.
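The fallback rules above can be sketched as a small selector: trust the tag only when confidence is high and detections have been stable for several consecutive frames, and let the driver force IMU-only with one toggle. The threshold values and names are assumptions.

```java
public class AimSourceSelector {
    public enum Source { FUSED, IMU_ONLY }

    private static final double MIN_CONFIDENCE = 0.8;
    private static final int MIN_CONSECUTIVE_DETECTIONS = 5;  // reject flicker

    private int consecutiveDetections = 0;
    private boolean driverForcedImuOnly = false;  // one-button driver toggle

    public void toggleFallback() {
        driverForcedImuOnly = !driverForcedImuOnly;
    }

    public Source select(boolean tagVisible, double tagConfidence) {
        consecutiveDetections = tagVisible ? consecutiveDetections + 1 : 0;
        boolean tagTrusted = tagVisible
                && tagConfidence >= MIN_CONFIDENCE
                && consecutiveDetections >= MIN_CONSECUTIVE_DETECTIONS;
        return (!driverForcedImuOnly && tagTrusted) ? Source.FUSED : Source.IMU_ONLY;
    }
}
```

A flickering detection never reaches `MIN_CONSECUTIVE_DETECTIONS`, so the selector naturally stays on IMU-only instead of forcing a correction.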
Step 5
Validate Fusion Robustness
Parked correction test (X/Y/bearing convergence).
Short motion segments + correction windows.
Full-cycle stress test with burst shooting active.
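The parked correction test can be scored mechanically: log repeated corrected poses while the robot sits still and check that X, Y, and bearing stay within a tolerance band of the first sample. The tolerances here are assumptions; pick values that match your shooting accuracy budget.

```java
import java.util.List;

public class ConvergenceCheck {
    // Each sample is {x, y, bearingDeg}. Returns true if every sample
    // stays within tolerance of the first one.
    public static boolean converged(List<double[]> samples, double tolXY, double tolDeg) {
        double[] first = samples.get(0);
        for (double[] s : samples) {
            if (Math.abs(s[0] - first[0]) > tolXY) return false;   // X drift
            if (Math.abs(s[1] - first[1]) > tolXY) return false;   // Y drift
            if (Math.abs(s[2] - first[2]) > tolDeg) return false;  // bearing drift
        }
        return true;
    }
}
```

If a parked robot fails this check, the problem is in tag detection or the offset math, and no amount of fusion tuning downstream will fix it.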
Match Rule
Run fusion as primary only after IMU-only and webcam-only modes are independently validated.
Fusion should improve reliability, not hide unstable base subsystems.