Hi, thanks for this work. I was wondering why, during training, it is necessary to make two passes through the Generator, i.e. once at line 145 and once at line 158.

Both of these lines are identical (in that you are not sampling new data), and presumably `fake.detach()` prevents contamination of gradients between the Discriminator update and the Generator update? I think I might be misunderstanding something; if it won't take much time, could you clarify this for me? Thanks.
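For reference, this is roughly the single-pass pattern I had in mind. It is only a minimal sketch with placeholder names — `netG`, `netD`, `criterion`, the optimizers, and the tensor shapes are illustrative, not taken from this repo:

```python
import torch
import torch.nn as nn

# Placeholder stand-ins, not the repo's models.
netG = nn.Linear(16, 32)                               # toy Generator
netD = nn.Sequential(nn.Linear(32, 1), nn.Sigmoid())   # toy Discriminator
criterion = nn.BCELoss()
optD = torch.optim.Adam(netD.parameters(), lr=2e-4)
optG = torch.optim.Adam(netG.parameters(), lr=2e-4)

real = torch.randn(8, 32)    # dummy "real" batch for the sketch
noise = torch.randn(8, 16)

# --- Discriminator update ---
fake = netG(noise)           # single forward pass through the Generator
optD.zero_grad()
lossD_real = criterion(netD(real), torch.ones(8, 1))
lossD_fake = criterion(netD(fake.detach()), torch.zeros(8, 1))  # detach: no gradients flow into G
(lossD_real + lossD_fake).backward()
optD.step()

# --- Generator update ---
optG.zero_grad()
lossG = criterion(netD(fake), torch.ones(8, 1))  # reuse the same `fake`; gradients now reach G
lossG.backward()
optG.step()
```

If `fake` is reused like this, my understanding is that `detach()` in the Discriminator step already keeps the two updates from interfering, which is why the second Generator forward pass surprised me.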