Question about detach #30

Comments (truncated excerpts):
Hi, we would like the gradient of losses to flow to …
@hyz-xmaster Thanks for your prompt reply. …
@hyz-xmaster any update? …
As I see in the QFL implementation, it uses …
@hyz-xmaster Exactly, I see that QFL does not and should not propagate gradients to …
See this line, we detach the … We only propagate gradients to …
@zen-d …
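The QFL comments above appear to refer to detaching the IoU quality target so that the classification loss updates only the predicted score, not the bbox branch that produced the target. A minimal sketch of that pattern, assuming a QFL-style formulation from the Generalized Focal Loss paper; the function and variable names are illustrative, not the code of either repository:

```python
import torch
import torch.nn.functional as F

def quality_focal_loss(pred_logits: torch.Tensor,
                       iou_targets: torch.Tensor,
                       beta: float = 2.0) -> torch.Tensor:
    """QFL-style loss for positive samples (illustrative sketch).

    pred_logits: (N,) raw classification logits.
    iou_targets: (N,) IoU between decoded predicted boxes and GT boxes.
    """
    targets = iou_targets.detach()   # the quality target carries no gradient
    pred_sigmoid = pred_logits.sigmoid()
    # |y - sigma|^beta down-weights easy examples, as in Generalized Focal Loss.
    modulating = (targets - pred_sigmoid).abs().pow(beta)
    loss = F.binary_cross_entropy_with_logits(
        pred_logits, targets, reduction='none') * modulating
    return loss.sum()
```

With the target detached, backpropagation through this loss reaches only `pred_logits`, which matches the "only propagate gradients to the classification score" point made in the excerpts.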
Original issue body:

Hi,
Thanks for the nice work. For calculating the IoU target, I think the detach should be applied to the "predicted items". Specifically, in my view, for the two lines below where detach is applied, the detach should be moved to their preceding lines, i.e. lines 408 and 424. Correct me if I am missing something.
https://github.com/hyz-xmaster/VarifocalNet/blob/master/mmdet/models/dense_heads/vfnet_head.py#L409
https://github.com/hyz-xmaster/VarifocalNet/blob/master/mmdet/models/dense_heads/vfnet_head.py#L425
Could you please explain it more? Thanks.
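For context, here is a minimal, self-contained sketch of the pattern the question concerns, written in plain PyTorch rather than with the mmdet helpers used in vfnet_head.py; the function and variable names are hypothetical. The IoU between the decoded predicted boxes and the ground-truth boxes is used as the classification (quality) target, and the predicted side is detached so the target carries no gradient.

```python
import torch

def iou_quality_target(decoded_preds: torch.Tensor,
                       decoded_targets: torch.Tensor) -> torch.Tensor:
    """Aligned IoU used as a gradient-free classification target.

    Both inputs are (N, 4) boxes in (x1, y1, x2, y2) format.
    """
    preds = decoded_preds.detach()   # no gradient flows through the target

    lt = torch.max(preds[:, :2], decoded_targets[:, :2])   # intersection top-left
    rb = torch.min(preds[:, 2:], decoded_targets[:, 2:])   # intersection bottom-right
    wh = (rb - lt).clamp(min=0)
    inter = wh[:, 0] * wh[:, 1]

    area_p = (preds[:, 2] - preds[:, 0]) * (preds[:, 3] - preds[:, 1])
    area_t = (decoded_targets[:, 2] - decoded_targets[:, 0]) * \
             (decoded_targets[:, 3] - decoded_targets[:, 1])

    iou = inter / (area_p + area_t - inter).clamp(min=1e-6)
    return iou.clamp(min=1e-6)
```

In this sketch, detaching the decoded predictions (as above) or detaching the raw predictions one step earlier, before decoding, yields the same gradient-free target; either placement keeps the bbox-regression branch trained only by its own regression loss, which appears to be the intent of the detach discussed in this thread.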