I am a bit confused by `ensure_shared_grads` here: https://github.com/ikostrikov/pytorch-a3c/blob/master/train.py#L13. The `grad` is synced only when the shared `grad` is `None`. I would have thought we need to set `shared_param._grad = param.grad` every time, since I don't see the `grad` synced anywhere else. Could anyone give me some hints about this?
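For reference, the `ensure_shared_grads` function in question reads roughly like this (quoting from the linked `train.py`; treat details as approximate):

```python
def ensure_shared_grads(model, shared_model):
    # Pair up local and shared parameters in order.
    for param, shared_param in zip(model.parameters(),
                                   shared_model.parameters()):
        # If the shared grads were already assigned once in this
        # process, there is nothing more to do.
        if shared_param.grad is not None:
            return
        # One-time aliasing: the shared parameter's grad now refers to
        # this process's local grad tensor.
        shared_param._grad = param.grad
```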
When you share the model, the parameters' `grad` attribute is not shared, so each process needs to have its own `grad` attribute, and that's why we only need to assign it once per process.
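To see why one assignment per process is enough: `shared_param._grad = param.grad` makes the shared parameter's `grad` attribute reference the very same tensor that autograd accumulates into on every later backward pass, so the alias stays fresh without re-syncing. Here is a minimal sketch of that aliasing (hypothetical toy modules, not the A3C code; it assumes gradients are zeroed in place rather than set to `None`):

```python
import torch
import torch.nn as nn

local = nn.Linear(4, 2)
shared = nn.Linear(4, 2)
shared.share_memory()  # parameter storage moves to shared memory; .grad does not

# First backward pass creates the local .grad tensors.
local(torch.randn(1, 4)).sum().backward()

# One-time aliasing, as in ensure_shared_grads.
for p, sp in zip(local.parameters(), shared.parameters()):
    sp._grad = p.grad

# Zero in place (not set to None) so the tensor objects survive,
# then run another backward pass.
for p in local.parameters():
    p.grad.zero_()
local(torch.randn(1, 4)).sum().backward()

# The shared side still sees the fresh gradients: same tensor objects.
for p, sp in zip(local.parameters(), shared.parameters()):
    assert sp.grad is p.grad
```

Note that this relies on gradients being accumulated and zeroed in place; calling `zero_grad(set_to_none=True)` in newer PyTorch would replace the local `grad` tensor and break the alias.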