What are the env params for testing or validating images? #6
Comments
Hi. Thank you for the kind words! You can change the environments used for testing. The default value for the environment is taken from one of the runs of the method on our data. When we used completely random initialisation values, the model often diverged. Using these coefficients instead resulted in more consistent and better training results on the tested scenes. Coincidentally, they are also used for rendering views where no other envmap is found, which are the validation and test views. To use an external LDR/HDR envmap, you would first need to convert it to SH coefficients. The conversion will just fit the closest SH coefficients with least squares. The script for that is not yet in the repo, but I'll upload it soon, along with instructions on how to reproduce our numerical results from the paper. The latter involves using external environment maps and this SH conversion step too, so it should be helpful.
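The conversion described above is simple enough to sketch: sample directions on the lat-long grid, evaluate the 9 real second-order SH basis functions, and solve a solid-angle-weighted least-squares problem for the 9×3 coefficients. This is not the author's script, just a minimal NumPy sketch under those assumptions (the basis ordering, the lat-long convention, and the `fit_sh` name are mine):

```python
# Hypothetical sketch: fit 2nd-order (9-term) SH coefficients to a lat-long
# HDR/LDR envmap with least squares. Not the repo's script; conventions assumed.
import numpy as np

def sh_basis(dirs):
    """Real SH basis up to l=2 (9 terms) for unit directions of shape (N, 3)."""
    x, y, z = dirs[:, 0], dirs[:, 1], dirs[:, 2]
    return np.stack([
        0.282095 * np.ones_like(x),          # l=0
        0.488603 * y,                        # l=1, m=-1
        0.488603 * z,                        # l=1, m=0
        0.488603 * x,                        # l=1, m=1
        1.092548 * x * y,                    # l=2, m=-2
        1.092548 * y * z,                    # l=2, m=-1
        0.315392 * (3 * z * z - 1),          # l=2, m=0
        1.092548 * x * z,                    # l=2, m=1
        0.546274 * (x * x - y * y),          # l=2, m=2
    ], axis=1)                               # (N, 9)

def fit_sh(envmap):
    """Fit 9x3 SH coefficients to an (H, W, 3) lat-long envmap (linear radiance)."""
    h, w, _ = envmap.shape
    theta = (np.arange(h) + 0.5) / h * np.pi             # polar angle in [0, pi]
    phi = (np.arange(w) + 0.5) / w * 2 * np.pi            # azimuth in [0, 2*pi]
    phi, theta = np.meshgrid(phi, theta)                   # both (H, W)
    dirs = np.stack([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)], axis=-1).reshape(-1, 3)
    basis = sh_basis(dirs)                                 # (N, 9)
    # Weight each pixel by its solid angle (proportional to sin(theta)).
    sw = np.sqrt(np.sin(theta).reshape(-1))
    A = basis * sw[:, None]
    b = envmap.reshape(-1, 3) * sw[:, None]
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)         # (9, 3)
    return coeffs

# Usage (hypothetical): coeffs = fit_sh(hdr_array)  # hdr_array: (H, W, 3) float
```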
Thank you for your patience in answering! Good luck with your work!
Hello, did you find where the envmap is located?
Can this code train a set of env parameters for the dataset?
@r00tman Thanks for your great work! I can't find the script for converting an LDR/HDR envmap to SH coefficients in your repo. Could you share this part of the code?
Hi!
Thanks for sharing your code; I think this is wonderful work on solving the problems of relighting outdoor scenes with a NeRF framework.
I have a question about the env params consisting of 9*3 variables. I understand how the network processes the training images and optimizes the env params from the default values. However, I find that the network also takes the default values as the testing env params and does not optimize them, if I am not mistaken. Is this reasonable? How do you get the default values?
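To illustrate what I mean, here is a minimal PyTorch sketch of how I imagine the per-image env params are handled: optimized jointly with the network for training views, but left at a fixed default for test/validation views. The names `DEFAULT_SH` and `EnvParams` are placeholders of mine, not the actual repo code.

```python
# Hypothetical sketch of per-image 9x3 SH env params; not the repo's implementation.
import torch
import torch.nn as nn

# Placeholder default SH coefficients (9 x 3); in practice taken from a
# previous run of the method, as described in the answer above.
DEFAULT_SH = torch.zeros(9, 3)

class EnvParams(nn.Module):
    def __init__(self, num_train_images):
        super().__init__()
        # One learnable 9x3 env param per training image, initialised from the default.
        self.train_envs = nn.Parameter(
            DEFAULT_SH.unsqueeze(0).repeat(num_train_images, 1, 1))

    def forward(self, image_idx, is_train):
        if is_train:
            # Optimized together with the rest of the network.
            return self.train_envs[image_idx]
        # Test/validation views have no per-image params; fall back to the default.
        return DEFAULT_SH.to(self.train_envs.device)
```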
Looking forward to your answer,
Thanks again!