Enhancement: scaling of average affine during template shape update #1225

Open
gdevenyi opened this issue Aug 13, 2021 · 3 comments
Labels
feature request: Ideas for new or improved ANTs features

Comments

@gdevenyi
Contributor

Is your feature request related to a problem? Please describe.
Currently, the average affine is applied in full during shape updating, whereas the non-linear warps are scaled.

Describe the solution you'd like
Scaled interpolation between the identity transform and the full average would allow affine transforms to be scaled during model building, with the eventual goal of a template pipeline that converges to a stable solution instead of running for a fixed number of iterations.
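One way to realize such scaled interpolation, sketched below under the assumption that a SciPy-style matrix logarithm is acceptable (the scale_affine helper is hypothetical, not existing ANTs code), is to take a fractional power of the homogeneous affine matrix: scale 0 yields the identity and scale 1 the full average.

import numpy as np
from scipy.linalg import expm, logm

def scale_affine(matrix4x4, scale):
    # Fractional power of a homogeneous 4x4 affine via matrix log/exp:
    # scale=0 -> identity, scale=1 -> the full transform (hypothetical helper)
    return expm(scale * logm(matrix4x4)).real

# Toy example: a small rotation plus a translation
A = np.array([[0.98, -0.17, 0.0,  2.0],
              [0.17,  0.98, 0.0, -1.5],
              [0.0,   0.0,  1.0,  0.3],
              [0.0,   0.0,  0.0,  1.0]])
half_step = scale_affine(A, 0.5)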

Describe alternatives you've considered
Don't

Additional context
With the average warp, the transform is applied several times; the same may need to be done here for the affine update to be equivalent.
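To illustrate that equivalence under the same hypothetical log/exp scaling sketched above: composing the 1/n-scaled affine n times reproduces the full average, mirroring how the scaled average warp is composed repeatedly.

import numpy as np
from scipy.linalg import expm, logm

A = np.array([[0.98, -0.17, 0.0,  2.0],
              [0.17,  0.98, 0.0, -1.5],
              [0.0,   0.0,  1.0,  0.3],
              [0.0,   0.0,  0.0,  1.0]])
# A quarter-step affine, composed four times, recovers the full transform
step = expm(logm(A) / 4).real
assert np.allclose(np.linalg.matrix_power(step, 4), A, atol=1e-8)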

@gdevenyi
Contributor Author

An implementation of the necessary maths exists in VTK; here's an example implementation in Python which could easily be translated to C++:

#!/usr/bin/env python

# https://vtk.org/doc/nightly/html/classvtkTransformInterpolator.html
# https://simpleitk.org/doxygen/latest/html/classitk_1_1simple_1_1Transform.html
# https://vtk.org/doc/nightly/html/classvtkMatrix4x4.html
# https://www.paraview.org/Bug/view.php?id=10102#c37133


import vtk
import SimpleITK as sitk

# Input transform
input_itk_transform = sitk.ReadTransform("trans0_GenericAffine.mat")
output_itk_transform = sitk.AffineTransform(3)
# Get the 12 affine parameters: the 3x3 matrix in row-major order,
# followed by the 3-component translation
input_itk_transform_parameters = input_itk_transform.GetParameters()

# Create VTK input/output transform, and an identity for the interpolation
input_vtk_transform = vtk.vtkTransform()
identity_vtk_transform = vtk.vtkTransform()
output_vtk_transform = vtk.vtkTransform()

# Set up the VTK transform by rebuilding the 4x4 row-major matrix from the
# ITK parameters: each row is three matrix entries plus one translation component
input_vtk_transform.SetMatrix(
    input_itk_transform_parameters[0:3]
    + input_itk_transform_parameters[9:10]
    + input_itk_transform_parameters[3:6]
    + input_itk_transform_parameters[10:11]
    + input_itk_transform_parameters[6:9]
    + input_itk_transform_parameters[11:12]
    + (0, 0, 0, 1)
)

# Create an interpolator
vtk_transform_interpolator = vtk.vtkTransformInterpolator()

# Build an interpolation stack, identity transform and the input transform
vtk_transform_interpolator.AddTransform(0, identity_vtk_transform)
vtk_transform_interpolator.AddTransform(1, input_vtk_transform)

# Generate some fractional steps for testing
for scale in [0.1, 0.25, 0.5, 0.75, 1]:
    # Interpolate a transform the given fraction of the way between the
    # identity and the input transform
    vtk_transform_interpolator.InterpolateTransform(scale, output_vtk_transform)


    # Copy the interpolated 4x4 matrix back into ITK parameter order:
    # the 3x3 matrix in row-major order, then the translation column
    output_matrix = output_vtk_transform.GetMatrix()
    output_itk_transform.SetParameters(
        tuple(output_matrix.GetElement(i, j) for i in range(3) for j in range(3))
        + tuple(output_matrix.GetElement(i, 3) for i in range(3))
    )


    sitk.WriteTransform(output_itk_transform, f"output{scale}.mat")
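As a quick sanity check (assuming the loop above has been run, so the output files exist), the transform written at scale 1 should report the same parameters as the input:

check_itk_transform = sitk.ReadTransform("output1.mat")
print(check_itk_transform.GetParameters())
print(input_itk_transform.GetParameters())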

@stnava
Member

stnava commented Aug 13, 2021

@gdevenyi

overall, I agree it would be good to have this done properly. however, we piloted all this stuff "back in the day" and found it didn't matter much. here is why:

  • a Euclidean average of the matrices is a pretty good approximation as long as the sample size is large enough and the distribution doesn't deviate too much from a multivariate gaussian centered at zero (thus the need for an average template)... also, good if the tails aren't huge.

  • we are only updating the rotation based on the average difference in orientations between the template and target images --- which tends to be small when the conditions above are met. so in this case, the geodesic and the crude linear approximation are nearly identical (a quick numerical sketch follows this list).

  • a Euclidean average will fail if you have two subjects with large rotations - but our default rigid probably won't like that either
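A quick numerical sketch of that point (illustrative only, using SciPy rather than anything in ANTs): for small rotations, the element-wise Euclidean mean projected back onto SO(3) is nearly indistinguishable from the geodesic (SLERP-midpoint) mean.

import numpy as np
from scipy.spatial.transform import Rotation, Slerp

rots = Rotation.from_euler("xyz", [[3, -2, 1], [-1, 4, -2]], degrees=True)

# Element-wise (Euclidean) mean of the two matrices, projected back onto
# SO(3) with an SVD
U, _, Vt = np.linalg.svd(rots.as_matrix().mean(axis=0))
euclidean_mean = U @ Vt

# Geodesic mean of two rotations = the SLERP midpoint
geodesic_mean = Slerp([0, 1], rots)([0.5]).as_matrix()[0]

print(np.abs(euclidean_mean - geodesic_mean).max())  # tiny for small angles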

regarding the code - this is the key line:

    vtk_transform_interpolator.InterpolateTransform(scale, output_vtk_transform)

which hides some complexity.
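Roughly, and without claiming this is VTK's exact algorithm: for the rigid part, interpolating against the identity amounts to a SLERP of the rotation plus a linear interpolation of the translation, as in the SciPy-based sketch below (which assumes the 3x3 block is a pure rotation; VTK additionally interpolates scale and shear from its own matrix decomposition).

import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def interpolate_rigid(matrix4x4, t):
    # Rotation part: SLERP from the identity to the input rotation
    rotations = Rotation.concatenate(
        [Rotation.identity(), Rotation.from_matrix(matrix4x4[:3, :3])]
    )
    out = np.eye(4)
    out[:3, :3] = Slerp([0.0, 1.0], rotations)([t]).as_matrix()[0]
    # Translation part: plain linear interpolation of the translation
    out[:3, 3] = t * matrix4x4[:3, 3]
    return out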

c3d may have something that's relevant too - not sure. @pyushkevich or @cookpa may know.

given that SimpleITK and ANTsPy exist - and everyone has Python - it might be easiest to just write a Python script that could do the required math fairly easily. but we do have some old C++ for this -- written maybe by @songgang ... not sure it would be worth the archaeological effort needed for that.

as an aside, there is not a single solution to this problem. the reason is that there are multiple viable distance metrics in the 3D rotation space. each one would produce a slightly different average (probably). but that's just an aside.

@gdevenyi
Contributor Author

> overall, I agree it would be good to have this done properly. however, we piloted all this stuff "back in the day" and found it didn't matter much. here is why:
>
> a Euclidean average of the matrices is a pretty good approximation as long as the sample size is large enough and the distribution doesn't deviate too much from a multivariate gaussian centered at zero (thus the need for an average template)... also, good if the tails aren't huge.
>
> we are only updating the rotation based on the average difference in orientations between the template and target images --- which tends to be small when the conditions above are met. so in this case, the geodesic and the crude linear approximation are nearly identical.

Yeah, I get that template construction has generally worked well enough in the past, but I'm still interested in the concept of convergence here, something I haven't been able to achieve when I've played with it.

> but our default rigid probably won't like that either

I have experienced this.

> regarding the code - this is the key line

Yup, and it's available in the C++ VTK, which is already an (optional) dependency of this project. I'm aware of the complexity it hides (particularly the SLERP).

> as an aside, there is not a single solution to this problem. the reason is that there are multiple viable distance metrics in the 3D rotation space. each one would produce a slightly different average (probably). but that's just an aside.

Yup, this is a method I found already implemented, rather than writing my own. I am working on averaging as per #1210.

Overall, I'm putting this down as "I may do this, but here's the info if someone else wants to in the meantime"

@cookpa added the feature request label on Aug 18, 2021