Hi Matt,

Thanks for this package, as well as the whole time series ecosystem you are building!

Is there a way to get the fitted values when training a model (i.e., on the training data), similar to the forecast package? And to evaluate the error? You can get the test error, but I could not find a way to get it for the training set.

I am looking for a way to assess whether the model is overfitting.
Thanks!
Carlos.
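For reference, this is what the forecast package provides out of the box; a minimal sketch using a standard ARIMA fit:

```r
library(forecast)

# Fit a model on the full (training) series
fit <- auto.arima(AirPassengers)

fitted(fit)    # in-sample fitted values
residuals(fit) # in-sample residuals
accuracy(fit)  # training-set error measures (ME, RMSE, MAE, MAPE, ...)
```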
Hi @coforfe, thanks for the kind words. The ecosystem is really coming together.

For GluonTS models, we don't return fitted values because GluonTS doesn't store them in its model objects.

I don't know if you can predict backwards. I'd need to see an implementation where a GluonTS model predicts the training data. If that's possible, we can then include fitted values inside the modeltime objects.
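One possible workaround for engines that can predict on the training data (an assumption that varies by engine, and per the above not the case for GluonTS) is to calibrate on the training split instead of the test split and compare the two accuracy tables. A sketch using a classical engine; function names come from modeltime/timetk, and whether a given engine returns meaningful in-sample predictions this way is an assumption:

```r
library(tidymodels)
library(modeltime)
library(timetk)

splits <- time_series_split(m750, assess = "2 years", cumulative = TRUE)

model_fit <- arima_reg() %>%
  set_engine("auto_arima") %>%
  fit(value ~ date, data = training(splits))

# Calibrate on the TRAINING data for an in-sample error estimate ...
modeltime_table(model_fit) %>%
  modeltime_calibrate(new_data = training(splits)) %>%
  modeltime_accuracy()

# ... and on the TESTING data for out-of-sample error. A large gap
# between the two suggests overfitting.
modeltime_table(model_fit) %>%
  modeltime_calibrate(new_data = testing(splits)) %>%
  modeltime_accuracy()
```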
Along the lines you mention, it is possible to run several models with different lookback_length values. It is not optimal, but at least it gives some sense of the level of error. A sketch of that approach is below.
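A rough sketch of that comparison, assuming the deep_ar() interface from modeltime.gluonts; the m4_monthly example data and the parameter values (epochs, lookback lengths, prediction horizon) are purely illustrative:

```r
library(tidymodels)
library(modeltime)
library(modeltime.gluonts)
library(timetk)

# Illustrative data: one monthly series with an id column, as deep_ar() expects
data   <- m4_monthly %>% filter(id == "M750")
splits <- time_series_split(data, assess = 24, cumulative = TRUE)

# Fit one DeepAR model per candidate lookback_length
models <- lapply(c(24, 48, 96), function(lb) {
  deep_ar(
    id                = "id",
    freq              = "M",
    prediction_length = 24,
    lookback_length   = lb,
    epochs            = 5
  ) %>%
    set_engine("gluonts_deepar") %>%
    fit(value ~ date + id, data = training(splits))
})

# Compare test-set error across lookback lengths
modeltime_table(models[[1]], models[[2]], models[[3]]) %>%
  modeltime_calibrate(new_data = testing(splits)) %>%
  modeltime_accuracy()
```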