Hemlock5712 / AdvantageKitSwerveTemplate

Template robot project with AdvantageKit, swerve drive, PathPlanner, and Limelight
GNU General Public License v3.0

Issue with zeroing swerve modules #22

Closed AydenLo17 closed 8 months ago

AydenLo17 commented 8 months ago

We are having a problem with zeroing our swerve modules. We use SDS MK4 modules with two Falcon 500s each. We point all the wheels forward with the bevel gears facing out, take the absolute angle value from AdvantageScope, and put it into the offsets in DriveConstants. But once we start driving the robot on stands, the wheels all point in different directions and spin in different directions.

JosephTLockwood commented 8 months ago

Hey @AydenLo17 ,

Thanks for trying out our template. I want to let you know that this differs from the official AdvantageKit template, which can be found here. We had issues similar to yours, and I would check these things:

  1. Make sure your motor inversions are set to false if you are using the MK4 and not the MK4i.
  2. We ended up using the CTRE values. I would reset all offsets to 0 and deploy, plot the AbsolutePosition, and make sure you pass the value it provides through Rotation2d.fromRotations.
  3. Make sure that your gyro is correct.

If none of these solutions work, I would check out the official AdvantageKit template and try their base format. I may have forgotten something when converting everything to this different format, and I would hate for you to waste your time debugging one of my issues.

As a side note: I will add that we have this code working on several bots, so I'm pretty sure debugging step 2 will solve your issues. Let me know if you need any more help and best of luck!!!
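The offset logic in step 2 can be sketched in plain Java (the method names here are illustrative, not the template's actual API): with all offsets reset to 0 and the wheels pointed forward, the raw absolute reading you plot is the offset; subtracting it from later raw readings, in rotations, gives the module angle.

```java
public class ModuleOffsetSketch {
  // Wrap a rotation value into [-0.5, 0.5) rotations.
  static double wrap(double rotations) {
    double r = rotations % 1.0;
    if (r >= 0.5) r -= 1.0;
    if (r < -0.5) r += 1.0;
    return r;
  }

  // Raw absolute-encoder reading minus the recorded offset gives the module angle.
  static double moduleAngleRotations(double rawAbsoluteRotations, double offsetRotations) {
    return wrap(rawAbsoluteRotations - offsetRotations);
  }

  public static void main(String[] args) {
    // With offsets zeroed and this wheel pointed forward, the raw reading IS the offset.
    double raw = 0.271;    // example value plotted in AdvantageScope
    double offset = 0.271; // value copied into DriveConstants
    System.out.println(moduleAngleRotations(raw, offset)); // 0.0 -> wheel reads straight
  }
}
```

The resulting rotations value is what would then be wrapped in Rotation2d.fromRotations for use by the module.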

CrispyBacon1999 commented 8 months ago

> We ended up using CTRE values. I would reset all values to 0 and deploy. Plot the AbsolutePosition and make sure you do Rotation2d.fromRotations the value it provides.

On this: these would be the values pulled from Tuner X. We were having similar issues when configuring until we started pulling the values from Tuner.

AydenLo17 commented 8 months ago

Got that fixed...

Now another problem: the Pigeon seems not to be communicating. In AdvantageScope it shows up as connected, but the robot is driving robot-oriented and also spins in circles in speaker or amp mode.

JosephTLockwood commented 8 months ago

We are aware of this issue. The recent update that Mechanical Advantage made to pose estimation appears not to be working on the real robot. I have reached out to them. It does currently work in sim if you would like to try that.

AydenLo17 commented 8 months ago

Gotcha, thank you for this information. I’ll just regularly check your GitHub for updates once Jonah responds to you and then we should be good to go. Thank you for this amazing base code!


JosephTLockwood commented 8 months ago

@AydenLo17 This issue should be resolved. I noticed in one of the logs that no timestamps from odometry were being added. It looks like I missed that on the last AdvantageKit update. It should be all good to go.

AydenLo17 commented 8 months ago

Sounds good, I'll check it out this morning when I get to school. I'll be sure to reach out if we have any additional issues. Thank you! Good luck this season!

AydenLo17 commented 8 months ago

In the brief time I had to test the code with the updated ModuleIOTalonFX, it would appear that the odometry still does not update.

JosephTLockwood commented 8 months ago

We plan to test in 2.5 hours; I really want to test before pushing, unlike last night's late-night push. But if you are out of school, I think the bug is fixed by #28. The original check from AdvantageKit was if (size() > 0). size() should not be used as an emptiness test, so we changed it to isEmpty(). If you add it manually, note that it needs a ! in front, i.e. !isEmpty().
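As a sketch of the guard change described above (the variable name is hypothetical; see #28 for the actual diff), both forms test for a non-empty queue, but the replacement is the idiomatic one and is where the missing ! mattered:

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class GuardSketch {
  // Original form of the guard: using size() as an emptiness test.
  static boolean oldGuard(Deque<Double> timestamps) {
    return timestamps.size() > 0;
  }

  // Replacement form -- note the '!' that was initially missing.
  static boolean newGuard(Deque<Double> timestamps) {
    return !timestamps.isEmpty();
  }

  public static void main(String[] args) {
    Deque<Double> timestamps = new ArrayDeque<>();
    System.out.println(oldGuard(timestamps) == newGuard(timestamps)); // agree when empty
    timestamps.add(1.0);
    System.out.println(oldGuard(timestamps) == newGuard(timestamps)); // agree when non-empty
  }
}
```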

Also, if there is anything you can think of that would be good to add, let us know, or if you have stuff you want to add/change/cleanup, feel free to add a Pull Request.

Finally, best of luck to you this season! What team are you on, if you don't mind me asking?

AydenLo17 commented 8 months ago

I will go test in like 20 minutes with your update where you deleted the exclamation mark. Good things to add would be the speaker aligning (which you already added) and quick paths to the different substations. Also, if teams want to add a second or third Limelight, I'd put in code for that.

AydenLo17 commented 8 months ago

It works! It drives field-oriented and the speaker aligning works; I just have to tune that PID because it jumps left and right of the alignment.

JosephTLockwood commented 8 months ago

Great! Okay, that sounds good. I figured the PID might be a little rough. I did add code last night to drive to the amp as an example, if you didn't see that; it just uses PathPlanner on the fly. I can add a line of code for multiple cameras. It should be as easy as changing

aprilTagVision = new AprilTagVision(new AprilTagVisionIOLimelight("limelight"));

to

aprilTagVision = new AprilTagVision(new AprilTagVisionIOLimelight("limelight"), new AprilTagVisionIOLimelight("limelight2"));

Of course, you would need to change the names as needed.

AydenLo17 commented 8 months ago

Okay, thank you! Where are the PID values for pointing toward the speaker? Also, our drive is being weird: we have to invert the FL and RL drive motors in Phoenix Tuner X every time we deploy code. Is there somewhere I can invert them in the code to make it easier?

JosephTLockwood commented 8 months ago

PID values are here: https://github.com/Hemlock5712/AdvantageKitSwerveTemplate/blob/1cd1c7204f41f77817c3483ad65b4630232cea91/src/main/java/frc/robot/subsystems/drive/DriveConstants.java#L125 What do you mean by invert? They drive in the wrong direction, so forward is backward?

AydenLo17 commented 8 months ago

Yes, when I drive forward, the right drive motors drive forward and the left drive motors drive in reverse.

JosephTLockwood commented 8 months ago

Change the encoder offset in ModuleConfig.

Rotation2d.fromRotations(yourValueThatIsBackwardsFromTunerX).plus(Rotation2d.fromDegrees(180));

AydenLo17 commented 8 months ago

Duh 🤦‍♂️ That's fixed now, and I am currently adjusting the PID for speaker alignment. Do you have it so the 2D pose gets updated at the start of an auton? In AdvantageScope it appears that the pose estimator is right, but the robot is way off from where it really is.

JosephTLockwood commented 8 months ago

Yes, that is correct. When you turn on the robot, the robot should go to 0,0. We use PathPlanner, so we take our Drive pose (just wheels and gyro) and our PoseEstimation (drivetrain pose plus vision measurements) and set both to PathPlanner's starting pose. We reset both so we can track how far apart they get during the match and determine whether we need to trust vision more or less. When viewing in AdvantageScope, I would use the following 3D view:

  1. PoseEstimation as the solid robot
  2. Drive as the yellow ghost
  3. Vision as the green ghost (it should be under Inst0 -> Robot3DPose or something close; I can't remember off the top of my head)

If you want to set it up so you can compare drive and pose like that, along with a button to reset so you don't have to run an auto to reset, you can see an example of what to change here. You would want to change the Pose2D to match where you are actually starting on the field.

JosephTLockwood commented 8 months ago

I can merge that if you think it would be useful. Also, let me know if that doesn't make sense or if you have questions. I can try and explain it better, or maybe Josh can explain it better than me.

AydenLo17 commented 8 months ago

Gotcha, that makes sense. I currently have Drive as the solid and Pose as the green. The solid starts at 0,0 and is in a bad spot, while PoseEstimation is pretty close to where it is on the field. It was really easy to add that second Limelight!

AydenLo17 commented 8 months ago

For your go to amp, are you using the pathfinding with the navigation grid?

JosephTLockwood commented 8 months ago

PoseEstimation is what matters (hence why I like it solid). That is where your robot thinks it is, and it is what's used for all path-following. As long as PoseEstimation is good, you are good.

Yes, we are using the pathfinding with the navigation grid.

AydenLo17 commented 8 months ago

Gotcha, yeah, it seems to be really accurate. So when you say to name the Limelights properly, how do I name each Limelight?

JosephTLockwood commented 8 months ago

You can name them however you want. You should be able to configure each Limelight's name with the Limelight Finder tool (I think???). Just make sure the string you are passing in matches the name of the Limelight.

AydenLo17 commented 8 months ago

Gotcha. The one I had on there ("limelight") worked, but I couldn't get the second one to, so I'll check the names on them at tomorrow's meeting. Also, when holding the button for go-to-amp, it wasn't pathfinding; it would just keep moving with the previous joystick command I gave it until I let go (e.g., if I'm driving forward at half speed and hold Y, it'll keep moving that way until I let go).

JosephTLockwood commented 8 months ago

It doesn't look like you have a fork. You will need to pull in the new LocalADStarAK. https://github.com/Hemlock5712/AdvantageKitSwerveTemplate/blob/main/src/main/java/frc/robot/util/LocalADStarAK.java

AydenLo17 commented 8 months ago

Oh yeah, I forgot about that. Don't you also have to select which pathfinder you want to use in Robot.java?

JosephTLockwood commented 8 months ago

No? I am not sure what you mean.

AydenLo17 commented 8 months ago

[image]

JosephTLockwood commented 8 months ago

Yes, you are right. I'll make that change now!!!

AydenLo17 commented 8 months ago

Glad I could help! During our autos, the robot only uses the pose estimates from vision, right? When the robot had a blind spot, it was like two feet off from where it should be. Do we possibly just need to add more Limelights?

JosephTLockwood commented 8 months ago

Thanks for that. It is done. I totally missed that thought. It could be anywhere.

JosephTLockwood commented 8 months ago

PoseEstimator uses vision and the drivetrain. I'm shocked that it got two feet off. It is possible that you may just need to add more Limelights, but I would verify that your camera's calibration is good and that the camera's position is set correctly on the device. A good test is to place your robot at a known location. You may find that your AprilTags, camera, or something else is slightly off.

JosephTLockwood commented 8 months ago

So when you have a blind spot with no vision, it uses the drivetrain only.

AydenLo17 commented 8 months ago

Okay, that's how I thought it worked. For example, with the robot centered on one of the note tape spots, in AdvantageScope it wasn't even on the tape spot. Any recommendations for good camera placement? This is BTR's first year doing vision, so we don't know much but want to use it this season. We saw you at IRI this past season and were impressed with how you performed.

JosephTLockwood commented 8 months ago

Yeah, I would check calibration, AprilTag placement, or even game piece placement (we have had that happen before). We have yet to pick a spot ourselves. The only rule of thumb most people seem to stress is to have the cameras angled up or down at the tags. It is easier to angle up, so most teams angle them up. Mechanical Advantage looks like they are mounting theirs off their swerve modules.

Also, thanks for the kind words about IRI!!! We were slightly disappointed in our performance this past year, though we hope to return this year! In 2022, we did way better than we had expected, so it leveled itself out lol. Hopefully, we will get to see you guys down the road!!!

AydenLo17 commented 8 months ago

Okay, I have a few good spots to put them by the swerve modules. Have you always used the Limelight, or have you looked into a system like what MA uses?

JosephTLockwood commented 8 months ago

We have used Limelight since it came out, so for a long time. Last year, Limelight didn't release anything early, and we wanted to write our own on-the-fly path following stuff (before it was cool, haha). We tried PhotonVision and had tons of issues during last season, so we swore we would return to Limelight. PhotonVision has had many updates since, so I wouldn't mind trying it, but I am still slightly scared. Their number of users has also ballooned, so it's much more tested now than when we were using it. I mean, we were like the alpha testers: everything was brand new, never tested. So my guess is most of the stuff by now is ironed out. The MA system looks nice, but I have heard from teams that tried it that it is challenging and offers little performance boost over PhotonVision and Limelight.

We will go with Limelight if we can. We only have one LL3, though, so we might try PhotonVision as a backup if we can't buy any more LL3s.

AydenLo17 commented 8 months ago

I gotcha. We have one LL3 and like 5-6 LL2+'s. The problem, though, is we have two teams combined this year (Respawn Robotics 325 and Operation Orange 144), so we have to build three robots: one alpha bot, one comp bot for 325, and one for 144. Luckily the code will be the same for all of them 😅

Our mentors were really against using the Limelight due to the cost, but I explained to them that it's our only current option as of now.

JosephTLockwood commented 8 months ago

Yeah, give the 2+ a shot; it might be good enough. I would first get just your three working, though. If you can get it working well, that would for sure help justify the cost! Robots are super expensive, and I get wanting to save money. However, if it lets you score an extra game piece or two in auto and two or three more in tele, how much would a mentor pay for an intake that could give them that type of performance boost? Probably more than the LL3 costs. You do need to make sure you get it down, though; vision is hard. As I said earlier, we really struggled with it, and it probably slowed us down and hurt us, especially early in the season.

AydenLo17 commented 8 months ago

I never thought of it that way; I will be sure to bring that up in tomorrow's meeting. Speaking of notes, do you know how to do note detection and make the robot point/drive to the note and acquire it?

JosephTLockwood commented 8 months ago

Yeah, that is my logic, granted I am a software guy. I always dislike it when teams spend tens of thousands of dollars to build robots and attend events but won't buy their coders anything. I know one team that last year wouldn't get their kid a $200 super-nice solution so he could try running PhotonVision.

As far as game piece tracking, I am not sure if we will do it or not, so I'm not sure if I will add an example. If I were to approach it, I would just use the color filter. Object detection with a Google Coral is a waste of time (in this case, in my opinion); these game pieces are so bright. The advantage of object detection is detecting things that aren't obvious. I would read the Limelight documentation; they have really good examples. It won't take you long to read them all, but the two I would focus on are: https://docs.limelightvision.io/docs/docs-limelight/getting-started/programming https://docs.limelightvision.io/docs/docs-limelight/pipeline-retro/retro-theory#from-pixels-to-angles

The second one is more complicated and tells you how to actually calculate distance and angle; the first is just the simple values from Limelight. I like simple solutions, so I would probably start with that: have the robot turn so the center of the object (tx) is in the middle of your screen, then drive forward until ty is a specific value, and that should get it for you. The more advanced way could be cool because you could actually calculate where the game piece is in the 3D world; you could then use some drive-to-point code to drive to it (similar to the heading rotation, only you would control the other controllers as well).

A LL2+ would be more than enough for this.
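The simple tx/ty approach described above can be sketched with plain proportional control. The gains and the target ty value below are made-up placeholders, not tuned numbers, and the method names are illustrative:

```java
public class NoteTrackingSketch {
  static final double KP_TURN = 0.02;       // placeholder gain: rotation output per degree of tx
  static final double KP_DRIVE = 0.05;      // placeholder gain: drive output per degree of ty error
  static final double TY_AT_PICKUP = -15.0; // placeholder: ty reading when the note reaches the intake

  // Turn toward the note until it is centered (tx -> 0).
  static double rotationCommand(double txDegrees) {
    return -KP_TURN * txDegrees;
  }

  // Drive forward until ty reaches the pickup value.
  static double forwardCommand(double tyDegrees) {
    return KP_DRIVE * (tyDegrees - TY_AT_PICKUP);
  }

  public static void main(String[] args) {
    // Note 10 degrees to the right and well above the pickup point:
    System.out.println(rotationCommand(10.0)); // negative -> turn toward it
    System.out.println(forwardCommand(-5.0));  // positive -> keep driving forward
  }
}
```

On a real robot these outputs would feed the swerve drive's rotation and translation inputs each loop, with tx and ty read from the Limelight's NetworkTables entries.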

AydenLo17 commented 8 months ago

Fixed Code Format.txt

Good to know; I will look into those, experiment with it, and send you the working code. I have two challenges for you:

  1. Be able to know the distance from the robot to its alliance's speaker to calculate shooter speed.

  2. Use this pathfinding command. The only problem I'm having with it is that I don't know how to flip the Pose2d location when you're on the other alliance. It can pathfind from anywhere on the field and avoid areas you mark on the navigation grid.

// ================================================
// DRIVER CONTROLLER - DPADDOWN
// AUTO DRIVE TO SHOOTPOS1
// ================================================
controller
    .povDown()
    .onTrue(
        AutoBuilder.pathfindToPose(
            new Pose2d(2.954, 3.621, Rotation2d.fromRadians(2.617)),
            new PathConstraints(
                4.0, 4.0, Units.degreesToRadians(360), Units.degreesToRadians(540)),
            0,
            2.0));
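On the alliance-flipping question above: the underlying math is just a mirror across the field's centerline (PathPlanner also ships a flip helper, but the arithmetic is easy to do by hand). A self-contained sketch, assuming the approximate 2024 field length and plain (x, y, heading) values in meters and radians:

```java
public class AllianceFlipSketch {
  static final double FIELD_LENGTH_METERS = 16.54; // approximate 2024 field length

  // Mirror an (x, y, heading) pose across the field's vertical centerline:
  // x is reflected, y is unchanged, and the heading is mirrored to pi - theta.
  static double[] flip(double x, double y, double headingRadians) {
    double flippedX = FIELD_LENGTH_METERS - x;
    double flippedHeading = Math.PI - headingRadians;
    return new double[] {flippedX, y, flippedHeading};
  }

  public static void main(String[] args) {
    // The blue-side shooting pose from the binding above, flipped for red:
    double[] red = flip(2.954, 3.621, 2.617);
    System.out.printf("%.3f %.3f %.3f%n", red[0], red[1], red[2]);
  }
}
```

With WPILib types, the same math would produce the Pose2d to hand to pathfindToPose when the alliance color (from DriverStation) is red.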

AydenLo17 commented 8 months ago

And use this ADStarAK file: LocalADStarAK.zip

JosephTLockwood commented 8 months ago

Yeah, you will want to use commands for both. Here is a branch with examples of how you would want to implement both. https://github.com/Hemlock5712/AdvantageKitSwerveTemplate/compare/main...AutoComp

I recommend you become familiar with WPI https://docs.wpilib.org/en/stable/docs/software/commandbased/commands.html and also their geometry stuff https://docs.wpilib.org/en/stable/docs/software/advanced-controls/geometry/pose.html
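For the first challenge, once the pose estimate is trusted, the speaker distance is just the norm of the vector from the robot's pose to the speaker's field coordinates. A plain-Java sketch (the speaker coordinates and the distance-to-RPM map below are illustrative placeholders, not calibrated values):

```java
public class ShooterDistanceSketch {
  // Rough blue-speaker location on the 2024 field, in meters (placeholder values).
  static final double SPEAKER_X = 0.0;
  static final double SPEAKER_Y = 5.55;

  // Straight-line distance from the robot's estimated pose to the speaker.
  static double distanceToSpeaker(double robotX, double robotY) {
    return Math.hypot(robotX - SPEAKER_X, robotY - SPEAKER_Y);
  }

  // Made-up linear map from distance to flywheel RPM, clamped at maxDistance;
  // a real robot would use a tuned lookup table instead.
  static double shooterRpm(double distanceMeters) {
    double minRpm = 2000.0, maxRpm = 5000.0, maxDistance = 6.0;
    double t = Math.min(distanceMeters / maxDistance, 1.0);
    return minRpm + t * (maxRpm - minRpm);
  }

  public static void main(String[] args) {
    double d = distanceToSpeaker(2.0, 5.55);
    System.out.println(d);             // 2.0 meters from the speaker
    System.out.println(shooterRpm(d)); // interpolated flywheel speed
  }
}
```

In the template's terms, robotX/robotY would come from the pose estimator (the PoseEstimation discussed earlier), and the red-alliance speaker location would be the mirrored coordinates.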

AydenLo17 commented 8 months ago

What exactly does the SetAutoStartPose do? That way I can create comments explaining what each thing does for my team to understand.

AydenLo17 commented 8 months ago

I found my problem with the pathfinding thing: I had my acceleration and velocity set to 0.5 for slow testing, and, well, I wasn't holding the button. It pathfinds to the paths for the amp and also pathfinds to the pose for shooting. Why do you recommend using paths instead of poses for final aligning?

JosephTLockwood commented 8 months ago

So, for your first question:

SetAutoStartPose -> It resets Drive and PoseEstimation to the same starting location. This way, you can see how vision affects where your robot thinks it is.

Second question, why do I recommend using paths -> I think paths are the way to go because you can program in full autos. You then have full, finer control over your robot once it hits the path's start pose. It isn't that different from pathfindToPose; the difference is that you drive to a pose that isn't your final destination and then run an auto routine. That makes it much easier and more consistent than driving to a pose and trying to do things once you are there. With a path, you hit a pose and can tune when your arm goes up, when to start the shooter motors, and when to trigger the shooters. Using pathfindToPose, you have to do everything after you get there.

AydenLo17 commented 8 months ago

Gotcha, that makes a lot more sense. Thank you for clarifying that!

AydenLo17 commented 8 months ago

So we got our alpha bot base done. We use NEO Vortexes with the MK4i's and CANcoders. Do you have a ModuleIOSparkFlex that uses the CANcoders? We tried to make one, but both of our front wheel offsets were turned to the left like 45 degrees.