Clean up logging #10
If I may, in this context I would suggest downgrading the log call at line 59 (commit 3009307) to logging.debug, so that the INFO logging level prints more manageable periodic progress info.
Also, thanks for developing this implementation! @cmbant and I are interfacing it in our sampler, and it appears to work well for our problems (though any progress on #3 would be very welcome!)
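For illustration, here is a minimal sketch of the kind of change being suggested; the helper name and message format below are made up for the example, not the actual Py-BOBYQA source:

```python
import logging

def report_evaluation(nf, fval):
    # Hypothetical helper (not the actual Py-BOBYQA code): before the
    # suggested change, every objective evaluation produced an INFO line.
    # logging.info("Function eval %d has f = %.10g", nf, fval)

    # After the change, per-evaluation detail is demoted to DEBUG, so a user
    # running at the INFO level only sees occasional progress messages.
    logging.debug("Function eval %d has f = %.10g", nf, fval)
```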
Hi Jesus - I agree with this (currently too much output is displayed). I'm glad it works well for you; regarding improved handling of correlations, I'm finishing my PhD thesis at the moment, but I hope to look into this later in the year. For the logging, I'm planning on doing something similar to Matlab's optimisation routines (e.g. fmincon), where at each iteration they print one line with a couple of key variables (current best value, # evals, optimality measure, ...).
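As a rough illustration of that idea (the function name, columns and format below are assumptions, not the package's actual output), an fmincon-style per-iteration summary could be as simple as:

```python
import logging

def log_iteration_summary(k, nf, best_f, delta):
    # One compact line per iteration: iteration count, number of objective
    # evaluations so far, current best value, and a trust-region radius /
    # optimality proxy (all names here are illustrative).
    logging.info("iter %4d  nf %6d  best f = %.6e  delta = %.3e",
                 k, nf, best_f, delta)

# With logging.basicConfig(level=logging.INFO), output would look like:
# INFO:root:iter   12  nf     57  best f = 3.421877e-03  delta = 1.250e-02
```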
Good luck with the thesis!
Not sure what "iteration" means in that sentence, but in general differentiating between two logging levels, one that prints once per target evaluation and another that prints progress information only sparsely, would already be a big improvement. I would also recommend, if I may, declaring a module-level logger and calling its logging methods, as described in the Python logging documentation, as opposed to calling the root-level logging functions directly.
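For reference, the module-level logger pattern being suggested is roughly the following generic sketch (standard Python logging usage, not the package's actual code):

```python
import logging

# Declared once at module scope; the logger's name is the module name
# (e.g. "pybobyqa.util"), so its level can be adjusted independently.
logger = logging.getLogger(__name__)

def do_work():
    # Calls go through the named logger instead of the root-level
    # logging.info(...) / logging.debug(...) functions.
    logger.info("sparse progress message")
    logger.debug("per-evaluation detail")
```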
I can see that you are now allowing for more manageable logging output. Also, you may want to consider creating a module-level logger with logging.getLogger(__name__). Thanks for the great work!
@JesusTorrado - I have (finally!) added module-level logging, thanks for this suggestion. At this stage I'm reluctant to change the levels for each message, but you should be able to deactivate the INFO messages for each evaluation (which are the only log messages produced by pybobyqa.util).
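With module-level loggers in place, a user can silence the per-evaluation INFO messages from their own code; a minimal sketch, assuming the logger name follows the usual module-name convention ("pybobyqa.util"):

```python
import logging

# Show INFO messages in general...
logging.basicConfig(level=logging.INFO)

# ...but suppress the per-evaluation INFO lines coming from pybobyqa.util,
# keeping only WARNING and above from that module.
logging.getLogger("pybobyqa.util").setLevel(logging.WARNING)
```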
That's brilliant, thanks a lot! I'll update the pybobyqa version in our next release. Note that my suggestion was not necessarily about changing the INFO logging level, but about having two separate levels, whichever they are: one that prints once per target evaluation, and another that prints progress information only sparsely (e.g. when specific values of the target metric are reached).
Use nicer logging (more like fmincon) with one line printed per iteration