Unbounded poisson solve 3d #135
Conversation
@fankiat I feel it should not be called
Hmm, that's one way to go about it, I guess. OK then, we can look into this in a separate PR after this one is merged. Maybe we open an issue for the refactoring and I can take care of it accordingly. Thanks!
minor.
sopht_mpi/numeric/eulerian_grid_ops/poisson_solver_3d/UnboundedPoissonSolverMPI3D.py
...umeric/test_eulerian_grid_ops/test_poisson_solver_3d/test_unbounded_poisson_solver_mpi_3d.py
LGTM. @fankiat after this, make sure you open an issue called "Road to Graduation" or similar, and start listing the functionalities for scaling, running high resolution, global profiling, etc., so we can prioritize accordingly.
Fixes #133, merge after #132.
@bhosale2 in this PR, `MPIDomainDoublingCommunicator3D` (currently named) is actually generic for both 2d and 3d in its current form. In other words, that same communicator can be used for `UnboundedPoissonSolverMPI2D`. However, the communicator includes a few abstractions, which may or may not be straightforward, but I've placed comments on some of the steps used. Nonetheless, there are a few things I need a second opinion on:

- Should the communicator be split across `mpi_utils.py`, `mpi_utils_2d.py`, and `mpi_utils_3d.py`?
- Or, since it is dimension-agnostic, should it live in `mpi_utils.py`?

The above points of course only make sense (somewhat) to discuss further if we decide to actually move the domain doubling communicator to a more appropriate place. If we keep the domain doubling communicators separated in 2d and 3d, we can probably keep the code in this PR as is, and keep the communicator in the 2d Poisson solver as simpler reference code (where the communicator is somewhat hardcoded and assumes only slab decomposition). Finally, depending on what we do here, I will come up with some test cases for the communicator, which I believe works correctly, given that the Poisson solver produces the right results.
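For context on why one communicator can serve both solvers: the domain doubling step at the heart of an unbounded (free-space) Poisson solve zero-pads the field to twice its extent along every axis before the FFT-based convolution, and that operation is dimension-agnostic. Below is a minimal serial sketch of the idea; this is not the `sopht_mpi` implementation (which distributes this copy across MPI ranks), and the function name `double_domain` is hypothetical:

```python
import numpy as np

def double_domain(field):
    # Hypothetical illustration: zero-pad the field to twice its extent
    # along every axis. The same code path handles 2D and 3D arrays,
    # which is why a single communicator can serve both solvers.
    doubled = np.zeros(tuple(2 * n for n in field.shape), dtype=field.dtype)
    doubled[tuple(slice(0, n) for n in field.shape)] = field
    return doubled

field_2d = np.ones((4, 4))
field_3d = np.ones((2, 2, 2))
print(double_domain(field_2d).shape)  # (8, 8)
print(double_domain(field_3d).shape)  # (4, 4, 4)
```

A distributed version additionally has to exchange ghost/remote chunks so each rank holds the right slab of the doubled domain, but the indexing logic above stays the same regardless of dimension.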