This repository has been archived by the owner on Jul 10, 2021. It is now read-only.

Error #219

Open
jgdwyer opened this issue Sep 28, 2016 · 1 comment
Comments


jgdwyer commented Sep 28, 2016

No description provided.


jgdwyer commented Sep 28, 2016

I noticed that when I weight my training examples non-uniformly while training a regressor-type MLP, my validation error becomes an order of magnitude better than my training error. Digging into the code, it looks like the loss function deliberately ignores the weights when calculating the validation error. Given that my stopping condition is that the validation error holds steady for some number of iterations, I would have expected the validation error to be calculated with the weights as well. Is there a specific reason it's not?

```python
if mode == 'train':
    loss += processor(Xb, yb, wb if wb is not None else 1.0)
else:
    loss += processor(Xb, yb)
count += 1
```

See lines 322-325 of backend/lasagne/mlp.py
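To illustrate the asymmetry, here is a minimal NumPy sketch (not the library's actual code; `batch_loss` and the weight values are hypothetical) showing how a weighted training loss and an unweighted validation loss can diverge sharply on identical predictions when the weights are strongly non-uniform:

```python
import numpy as np

def batch_loss(y_true, y_pred, weights=None):
    # Sum of per-example squared errors; weights default to uniform (1.0),
    # mirroring how the validation pass omits the weight argument.
    errors = (y_true - y_pred) ** 2
    if weights is None:
        return errors.sum()
    return (weights * errors).sum()

rng = np.random.default_rng(0)
y_true = rng.normal(size=100)
y_pred = y_true + rng.normal(scale=0.1, size=100)

# Strongly non-uniform weights, e.g. heavily emphasizing a few examples.
w = np.ones(100)
w[:5] = 100.0

train_style = batch_loss(y_true, y_pred, w)  # weighted, as in 'train' mode
valid_style = batch_loss(y_true, y_pred)     # unweighted, as in validation
```

With weights like these, `train_style` exceeds `valid_style` by roughly the extra mass placed on the up-weighted examples, which is consistent with the training error appearing much worse than the validation error even though the model's predictions are the same in both passes.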
