This repository has been archived by the owner on Feb 14, 2023. It is now read-only.

Cargo tracker #92

Open · wants to merge 7 commits into main

Conversation

varun7654 (Collaborator) commented Aug 29, 2022

Create a system to track the position of cargo

https://gist.github.com/varun7654/3adedeabc6b6b8d2e3db46b01a2d6266

        @NotNull AxisAngle4d robotRotation, @NotNull Vector3d cameraPosition,
        @NotNull ArrayList<CargoPosition> cargoPositions) {
    for (Vector3d detectedCargoPosition : detectedCargoPositions) {
        INTAKE_CAMERA_ROTATION.transform(robotRotation.transform(detectedCargoPosition)).add(cameraPosition);

Is detectedCargoPosition in the intake camera's reference frame? If so, shouldn't you be applying these in reverse order to get the field-relative position:

cargo_wrt_field = robotRotation^-1.transform(intake_camera_rotation^-1.transform(detectedCargoPosition) - cameraPosition)

Maybe I'm misunderstanding what's going on here.
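
For reference, a minimal sketch of the camera-frame-to-field-frame conversion under one set of assumptions: detectedCargoPosition is measured in the intake camera's frame, INTAKE_CAMERA_ROTATION rotates camera-frame vectors into the robot frame, robotRotation rotates robot-frame vectors into the field frame, and the camera position is given in field coordinates. The cameraToField helper below is hypothetical and not part of the PR; it uses javax.vecmath types, whereas the PR's own vector classes may differ.

```java
import javax.vecmath.AxisAngle4d;
import javax.vecmath.Matrix3d;
import javax.vecmath.Vector3d;

public final class CargoFrameSketch {
    // Placeholder for the PR's INTAKE_CAMERA_ROTATION constant, assumed here to be a
    // camera-frame -> robot-frame rotation. Identity is used only so the sketch compiles.
    static final Matrix3d INTAKE_CAMERA_ROTATION = new Matrix3d(
            1, 0, 0,
            0, 1, 0,
            0, 0, 1);

    // Hypothetical helper: rotate camera -> robot, then robot -> field, then translate
    // by the camera's field-relative position.
    static Vector3d cameraToField(Vector3d detectedCargoPosition,
                                  AxisAngle4d robotRotation,
                                  Vector3d cameraPositionRelativeToField) {
        Vector3d cargo = new Vector3d(detectedCargoPosition);

        INTAKE_CAMERA_ROTATION.transform(cargo);   // camera frame -> robot frame

        Matrix3d robotToField = new Matrix3d();
        robotToField.set(robotRotation);           // AxisAngle4d -> rotation matrix
        robotToField.transform(cargo);             // robot frame -> field frame

        cargo.add(cameraPositionRelativeToField);  // translate to the field origin
        return cargo;
    }
}
```

This sketch uses the non-inverted order; whether the inverses in the comment above are needed depends entirely on which direction robotRotation and INTAKE_CAMERA_ROTATION are defined to map, which is the convention the thread is trying to settle.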

varun7654 (Collaborator, Author)

Does the order we apply the rotations really matter? They're about their own axes. Also, shouldn't we add the cameraPosition to make the position relative to the field? (I changed the name of cameraPosition to cameraPositionRelativeToField to make it clearer what it is.)

I don't think we want to find the inverse of the robot rotation, right?
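
On the order question: rotations about different axes do not commute in general, so the order in which the camera rotation and robotRotation are applied does change the result. A quick, self-contained check (the angles are illustrative only, using javax.vecmath as in the sketch above):

```java
import javax.vecmath.AxisAngle4d;
import javax.vecmath.Matrix3d;
import javax.vecmath.Vector3d;

public final class RotationOrderCheck {
    public static void main(String[] args) {
        // Two example rotations about different axes.
        Matrix3d aboutZ = new Matrix3d();
        aboutZ.set(new AxisAngle4d(0, 0, 1, Math.PI / 2)); // 90 deg yaw
        Matrix3d aboutX = new Matrix3d();
        aboutX.set(new AxisAngle4d(1, 0, 0, Math.PI / 2)); // 90 deg roll

        Vector3d v1 = new Vector3d(1, 0, 0);
        aboutZ.transform(v1);
        aboutX.transform(v1);   // X applied after Z

        Vector3d v2 = new Vector3d(1, 0, 0);
        aboutX.transform(v2);
        aboutZ.transform(v2);   // Z applied after X

        // Prints roughly (0, 0, 1) vs (0, 1, 0) (up to floating-point noise),
        // so the composition order matters.
        System.out.println(v1 + " vs " + v2);
    }
}
```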
