Sequential Sampling Models (e.g., DDM, LBA) in RxInfer? #143
Replies: 4 comments 8 replies
-
Thanks for your kind words! @bvdmitri has indeed done a tremendous job with RxInfer.jl! At the moment, implementing the exact model you've provided is a challenge, but it's certainly possible. Let me explain why it's challenging for now.

As you've heard in @bvdmitri's talk, RxInfer.jl maps a probabilistic model onto a factor graph. Every distribution you've used will have a corresponding node in the graph, e.g., MvNormal, truncated Gaussians, Uniform, and LBA. However, ReactiveMP.jl (the inference engine of RxInfer.jl) does not have nodes for truncated distributions and certainly does not have an LBA node. Moreover, RxInfer.jl expects every node in the graph to "know" how to communicate with the rest of the graph by sending so-called messages (updates), and it must also know how to compute the posterior resulting from the collision of these messages. This requires some mathematical derivations, which can be challenging as they involve intractable integrals. You can find an example of how to derive such rules here.

Now, a few solutions come to mind for running inference in this model without much coding and math involved :) One approach is to cast every distribution into a Gaussian and use a deterministic node wrapper over your LBA model. However, this is not ideal, as the LBA is a distribution in its own right and not deterministic by nature. @bvdmitri and @ismailsenoz are working on an inference solution for distributions outside the exponential family, which could help in this case. Alternatively, you could use amortization techniques to approximate the LBA with normalizing flows and then use that approximated density inside the model. I believe @itsdfish had a similar problem here, but it might get complex too. I am not an expert on density estimation problems, so I can only offer limited help with this approach.

P.S. We are working hard now on RxInfer 3.0, which will make inference for this type of model available almost out of the box.
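To make the factor-graph mapping concrete, here is a minimal sketch of a conjugate model that RxInfer.jl handles out of the box — a Beta–Bernoulli coin model. The syntax follows recent RxInfer releases and may differ slightly across versions, so treat it as illustrative rather than definitive:

```julia
using RxInfer

# Each tilde below becomes a node in the factor graph: one Beta prior node
# and one Bernoulli likelihood node per observation. Inference proceeds by
# message passing between these nodes, which works here because the
# Beta/Bernoulli message-update rules are available in closed form.
@model function coin_model(y)
    θ ~ Beta(1.0, 1.0)
    for i in eachindex(y)
        y[i] ~ Bernoulli(θ)
    end
end

dataset = float.(rand(10) .< 0.7)  # simulated coin flips
result = infer(model = coin_model(), data = (y = dataset,))
```

An LBA node would need analogous message-update rules for each of its edges, and deriving those is exactly the hard part described above.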
-
@albertpod, that sounds like a great development! I look forward to trying the new version once it is available. Thanks again for your help and patience.
-
I just wanted to give you a quick heads-up on this discussion. @itsdfish we will be trying out models with intractable likelihoods. Hopefully, I can tag you once we start filing PRs :)
-
Excited to see the great work done on v4 to streamline the syntax.
-
In psychology/neuroscience, sequential sampling models (such as drift diffusion, linear ballistic accumulators, etc.) are in theory very useful and desirable, but in practice there is an accessibility gap that prevents many researchers from actually using them; they are notoriously hard to implement, whether in Stan, R, or Python.
The SSM package in Julia is one of the latest key steps toward making them more accessible, implementing many of these distributions in Julia. It is usable with Turing, but it is very slow, especially for hierarchical models (with random factors).
Here is a small example of a Turing LBA:
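The following is a hedged sketch of what such a model might look like. The keyword names (ν for drift rates, A for the start-point range, k for the threshold gap, τ for non-decision time) follow the LBA constructor in SequentialSamplingModels.jl, but the exact API and priors here are illustrative assumptions — check the package documentation for your version:

```julia
using Turing, SequentialSamplingModels

# Minimal (non-hierarchical) two-choice LBA model sketch.
# Priors are illustrative placeholder choices, not recommendations.
@model function lba_model(data; min_rt = minimum(data.rt))
    ν ~ filldist(truncated(Normal(3.0, 1.0); lower = 0.0), 2)  # drift rates
    A ~ truncated(Normal(0.8, 0.4); lower = 0.0)               # start-point range
    k ~ truncated(Normal(0.2, 0.2); lower = 0.0)               # threshold gap
    τ ~ Uniform(0.0, min_rt)                                   # non-decision time

    # Each observation is a (choice, rt) pair scored jointly by the LBA density
    data ~ LBA(; ν, A, k, τ)
end

# Hypothetical usage: simulate data from known parameters, then sample
choice, rt = rand(LBA(; ν = [3.0, 2.0], A = 0.8, k = 0.2, τ = 0.3), 100)
chain = sample(lba_model((; choice, rt)), NUTS(), 1_000)
```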
If RxInfer were able to fit these types of models (faster than Turing), it would be a real game changer for the neuro/psych field.
But is it feasible? If not, what would it require?
Thanks a lot!
PS: I just watched your talk, great stuff 👍