When training an HDP model, perplexity and LL per word return NaN
Hi @bab2min,
I'm training an HDP model and reporting perplexity and log-likelihood every 200 iterations.
But when the number of iterations reaches 36600 (as shown in the attached picture), both metrics return NaN.
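For reference, a minimal sketch of this kind of training/reporting loop, assuming the model is tomotopy's `HDPModel` (the corpus file name, hyperparameters, and total iteration count below are placeholders, not the actual settings from the report):

```python
import math
import tomotopy as tp

# Hypothetical hyperparameters; the real settings may differ.
mdl = tp.HDPModel(tw=tp.TermWeight.ONE, min_cf=5, initial_k=10, seed=42)

# Hypothetical corpus: one whitespace-tokenized document per line.
for line in open('corpus.txt', encoding='utf-8'):
    words = line.strip().split()
    if words:
        mdl.add_doc(words)

mdl.burn_in = 100
mdl.train(0)  # initialize the model before the incremental loop

for i in range(0, 40000, 200):
    mdl.train(200)  # run 200 more sampling iterations
    ll = mdl.ll_per_word
    ppl = mdl.perplexity
    print(f'Iter {i + 200}: LL per word = {ll:.4f}, '
          f'perplexity = {ppl:.2f}, live topics = {mdl.live_k}')
    if math.isnan(ll) or math.isnan(ppl):
        # Around iteration 36600, both metrics come back as NaN.
        break
```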
I noticed that this bug was supposed to have been fixed, so what could be going wrong in my training?
By the way, what is the appropriate criterion for choosing a model based on perplexity (or LL per word)?
Thank you!