
Jaws 2.

[GIF: autonomous routine recording]

Image 1) This visual shows the preload of a seven-sequence autonomous routine. As the foundation of that routine, I fused four computer vision pipelines through a simplified Kalman filter to estimate the robot's global pose in field space with less than 1 cm of error.
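Below is a minimal sketch of how that kind of fusion can look in WPILib Java: each camera result is folded into a SwerveDrivePoseEstimator, which applies a Kalman-style correction on top of wheel odometry. The class name, standard-deviation values, and helper methods are assumptions for illustration, not the team's actual implementation.

    import edu.wpi.first.math.VecBuilder;
    import edu.wpi.first.math.estimator.SwerveDrivePoseEstimator;
    import edu.wpi.first.math.geometry.Pose2d;
    import edu.wpi.first.math.geometry.Rotation2d;
    import edu.wpi.first.math.kinematics.SwerveDriveKinematics;
    import edu.wpi.first.math.kinematics.SwerveModulePosition;

    /** Sketch: fuse several vision pipelines into one field-space pose estimate. */
    public class PoseFusion {
        private final SwerveDrivePoseEstimator estimator;

        public PoseFusion(SwerveDriveKinematics kinematics,
                          Rotation2d gyroAngle,
                          SwerveModulePosition[] modulePositions) {
            // Kinematics and initial pose are placeholders for the real drivetrain config.
            estimator = new SwerveDrivePoseEstimator(
                    kinematics, gyroAngle, modulePositions, new Pose2d());
        }

        /** Called every loop with the latest odometry inputs. */
        public void updateOdometry(Rotation2d gyroAngle, SwerveModulePosition[] modulePositions) {
            estimator.update(gyroAngle, modulePositions);
        }

        /**
         * Called once per camera result. Larger standard deviations make the filter
         * trust a noisy pipeline less; the scaling here is illustrative only.
         */
        public void addVisionResult(Pose2d visionPose, double timestampSeconds, double distanceMeters) {
            double xyStdDev = 0.02 * distanceMeters;          // trust drops with target distance
            double headingStdDev = Math.toRadians(5.0);
            estimator.addVisionMeasurement(
                    visionPose, timestampSeconds,
                    VecBuilder.fill(xyStdDev, xyStdDev, headingStdDev));
        }

        public Pose2d getFieldPose() {
            return estimator.getEstimatedPosition();
        }
    }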

The Context.

After onboarding onto Code Orange (FRC 3476), I was tasked with addressing an issue the team had faced all season: outdated code.

The Problem.

Last season's code was built on the TimedRobot skeleton, an older structure that lacks the scheduling and composability of the command-based framework most FRC robots now run on.
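For context, a command-based robot class hands its periodic loop to WPILib's CommandScheduler instead of hand-coding sequencing inside a bare TimedRobot project. The skeleton below is a generic sketch of that structure, not Code Orange's code.

    import edu.wpi.first.wpilibj.TimedRobot;
    import edu.wpi.first.wpilibj2.command.CommandScheduler;

    /** Generic command-based skeleton: subsystems and commands do the real work. */
    public class Robot extends TimedRobot {
        @Override
        public void robotPeriodic() {
            // The scheduler runs each subsystem's periodic() method and every active
            // command, replacing the hand-rolled sequencing a bare TimedRobot needs.
            CommandScheduler.getInstance().run();
        }
    }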

The Solution.

I built a three-layer hardware abstraction that separates motor and sensor I/O from subsystem logic and high-level robot goals. I also introduced a finite state machine to make the code's flow easier to read and follow.
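Sketched below, with assumed names and setpoints, is the shape of that layering: an IO interface that isolates raw motor and sensor calls, a subsystem that owns a goal-state enum, and a periodic method that acts as the state machine. The real subsystems differ in detail.

    import edu.wpi.first.wpilibj2.command.SubsystemBase;

    /** Layer 1 (assumed name): raw hardware access, swappable for simulation. */
    interface IntakeIO {
        void setRollerVolts(double volts);
        boolean hasGamePiece();
    }

    /** Layers 2 and 3: subsystem goals expressed as a finite state machine. */
    class Intake extends SubsystemBase {
        /** Goal states; transitions happen only in periodic(). */
        enum State { IDLE, INTAKING, HOLDING, EJECTING }

        private final IntakeIO io;
        private State state = State.IDLE;

        Intake(IntakeIO io) {
            this.io = io;
        }

        /** Higher-level code only requests goals; it never touches hardware. */
        public void requestIntake() { state = State.INTAKING; }
        public void requestEject()  { state = State.EJECTING; }

        @Override
        public void periodic() {
            switch (state) {
                case INTAKING -> {
                    io.setRollerVolts(8.0);
                    if (io.hasGamePiece()) state = State.HOLDING;
                }
                case HOLDING  -> io.setRollerVolts(0.5);   // small holding voltage
                case EJECTING -> io.setRollerVolts(-10.0);
                case IDLE     -> io.setRollerVolts(0.0);
            }
        }
    }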

The Outcome.

As a result of the new software structure, I was able to improve autonomous routine performance by 33% and streamline troubleshooting during high-stress competitions.

[GIF: algae scoring]

Image 2) This graphic shows the scoring capabilities of the robot, which collects large yoga-ball game pieces and ejects them into an 8-foot net area.

[GIF: climb sequence]

Image 3) This video shows the climbing sequence for Jaws 2, which lifts the 130+ pound robot off the ground using precise servo timing and beam-break sensing.
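A hedged sketch of what a sequence like that can look like in WPILib terms: a command that releases a ratchet servo, waits for a beam-break sensor, then hands off to the winch. Channel numbers, setpoints, and sensor polarity are assumptions for illustration only.

    import edu.wpi.first.wpilibj.DigitalInput;
    import edu.wpi.first.wpilibj.Servo;
    import edu.wpi.first.wpilibj.Timer;
    import edu.wpi.first.wpilibj2.command.Command;

    /** Sketch of a climb command gated by a beam-break sensor and servo timing. */
    class ClimbCommand extends Command {
        private final Servo ratchetServo = new Servo(0);            // PWM channel assumed
        private final DigitalInput beamBreak = new DigitalInput(1); // DIO channel assumed
        private final Timer timer = new Timer();

        @Override
        public void initialize() {
            ratchetServo.set(0.8);   // release position, tuned on the real robot
            timer.restart();
        }

        @Override
        public void execute() {
            // Only start winching after the servo has had time to move and the
            // beam break reports the hook is seated (polarity depends on wiring).
            if (timer.hasElapsed(0.5) && !beamBreak.get()) {
                // winch output would be commanded here in the real subsystem
            }
        }

        @Override
        public boolean isFinished() {
            return false; // held until the driver releases the climb button
        }
    }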
