Rework Weighted Particles #31
I am quite unhappy with how weighted particles work at the moment and was actually considering removing support for them, keeping only the weight-manipulating functions. You have probably noticed how the use of weighted particles causes quite a lot of warnings about poor support... Certainly, the current implementation carries around a lot of redundant weights, and some form of weight sharing is absolutely required. Your proposed approach above seems interesting. I can't really visualize right now how the details would look, but it could work. Possibly, the particles::Vector{T} field of struct WeightedParticles{T,N} <: AbstractParticles{T,N} could be of type ...
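To make the redundancy concrete, here is a minimal self-contained sketch (not the package's actual layout, just an illustration of a per-instance weight design):

# stand-in for a per-instance design: every variable drags along its own weight copy
struct WP{T}
    particles::Vector{T}
    logweights::Vector{T}
end

x = randn(2000)
w = zeros(2000)
a = WP(x, copy(w))
b = WP(2 .* x, copy(w))          # derived from the same sample, yet...
a.logweights === b.logweights    # false: two copies of the same weights to keep in sync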
In that case, maybe it would make sense to remove them from master and add a branch to prototype an alternative? I've got a sprint coming up for the JuliaCon proceedings deadline, but after a couple of weeks things will clear back up, and I'd love to help get something more robust in place for this.
There's now a PR that removes support for WeightedParticles until they are better supported.
Hi @baggepinnen, I was thinking some more about this today... I think we could do something like

using MonteCarloMeasurements: Particles, AbstractParticles
using DataStructures: MutableLinkedList

struct Weights{T,N}
    vals::Vector{T}
    # all WeightedParticles sharing this weight vector (typed loosely, since
    # WeightedParticles is only defined below)
    members::MutableLinkedList{Any}
end

struct WeightedParticles{T,N} <: AbstractParticles{T,N}
    particles::Vector{T}
    weights::Weights{T,N}
    function WeightedParticles(x, w=nothing)
        x = x isa AbstractParticles ? x.particles : Particles(x).particles
        T, N = eltype(x), length(x)
        if !(w isa Weights)   # otherwise share the existing Weights object
            vals = w isa AbstractVector ? collect(T, w) : zeros(T, N)
            w = Weights{T,N}(vals, MutableLinkedList{Any}())
        end
        wp = new{T,N}(x, w)
        push!(w.members, wp)
        return wp
    end
end
julia> x = WeightedParticles(Particles().particles, ones(2000))
WeightedParticles{Float64,Array{Float64,1},2000}
1.77636e-18 ± 1.0
julia> y = WeightedParticles(Particles(), x.weights)
WeightedParticles{Float64,Array{Float64,1},2000}
1.24345e-17 ± 1.0
julia> x.weights[1] = 2.0
2.0
julia> y.weights[1]
2.0

All the stats need to be reworked for the weighted case, but hopefully this gets the idea across. And I guess you might have some old code for all of those.
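As a sketch of what one of those reworked statistics could look like on top of this layout (assuming the shared weights.vals holds log-weights, which is a guess; the helper name is hypothetical):

# weighted mean over the shared weights; hypothetical helper, not part of the package
function weighted_mean(p::WeightedParticles)
    ℓ = p.weights.vals
    w = exp.(ℓ .- maximum(ℓ))        # shift by the maximum for numerical stability
    sum(w .* p.particles) / sum(w)   # self-normalized importance-sampling estimate
end

Because the Weights object is shared, reweighting through any member changes weighted_mean for every variable in the system at once.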
Hi Chad, your proposed approach looks much better than what I had before. Let's see if I have understood it properly ... I did have some functionality in place to operate on weighted particles, like ...
Right. I'd think the multivariate-ness of it would usually be implicit, the same way ...

I'd think it would be just like ...
For a moment I thought this might be entirely wrong, but I'm pretty sure this is correct. In the particle filtering I've done (maybe there are variations?), each particle maintains a state of the "world", together with a weight (usually stored as a log). With ...

Maybe it would be helpful to work through an importance sampling example, like sampling from a Student's t and weighting it to act like a Gaussian, then going through some elementwise operations. Since we know the right answers, this could also be a good basis for tests.
Ok, here's an example. I think all of this checks out:

using MonteCarloMeasurements
using Distributions

# Expected value of x log-weighted by ℓ
function expect(x,ℓ)
    w = exp(ℓ - maximum(ℓ))
    mean(x*w) / mean(w)
end
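# For reference, expect is the self-normalized importance-sampling estimate
#     E[f(x)] ≈ sum(wᵢ * f(xᵢ)) / sum(wᵢ),  with wᵢ = exp(ℓᵢ - maximum(ℓ));
# subtracting maximum(ℓ) rescales every wᵢ by the same factor, so the ratio is
# unchanged while exp is kept from overflowing.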
x = Particles(5000,TDist(3))
ℓ = logpdf(Normal(),x) - logpdf(TDist(3), x)
expect(x,ℓ)
expect(x^2,ℓ)
expect(x^3,ℓ)
expect(x^4,ℓ)
# Now sample z ~ Normal()
z = Particles(5000,Normal())
expect(z,ℓ)
expect(z^2,ℓ)
expect(z^3,ℓ)
expect(z^4,ℓ)
# And now a new weighted variable
y = Particles(5000,TDist(3))
ℓ += logpdf(Normal(),y) - logpdf(TDist(3), y)
expect(y,ℓ)
expect(y^2,ℓ)
expect(y^3,ℓ)
expect(y^4,ℓ)
expect(x+y,ℓ)
expect((x+y)^2,ℓ)
expect(x+z,ℓ)
expect((x+z)^2,ℓ)
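Since z is standard normal and independent of the weights, the right answers for its moments are 0, 1, 0 and 3, so the example above could be turned into tests along these lines (a sketch; the tolerances are rough guesses for 5000 samples and would need tuning):

using Test

@testset "reweighting recovers Normal moments" begin
    @test expect(z,ℓ)       ≈ 0  atol=0.1
    @test expect(z^2,ℓ)     ≈ 1  atol=0.1
    @test expect(z^4,ℓ)     ≈ 3  atol=0.5
    @test expect((x+z)^2,ℓ) ≈ 2  atol=0.3   # x acts like N(0,1) after reweighting
end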
I've been thinking about the approach for WeightedParticles. When I've needed this I haven't been able to use the built-in type. As it stands, the weighting vector is per-Particles, but I've needed a shared weighting vector across several Particles.

Is the non-sharing an intentional design choice, or are you open to this sharing idea? I think it would look something like this:
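A minimal sketch of what this could look like (the struct layout and the way next receives the resampling indices are illustrative assumptions, not a definitive design):

using MonteCarloMeasurements: AbstractParticles

# every variable in a system shares one and the same log-weight vector (===),
# so reweighting or resampling acts on all of them at once
struct WeightedParticles{T,N} <: AbstractParticles{T,N}
    particles::Vector{T}
    logweights::Vector{T}   # shared across the whole system
end

# after a resample, a stale variable is passed through `next` to reindex its values
# (here `idx` is the resampling index vector; how it gets here is left open)
function next(p::WeightedParticles{T,N}, idx::AbstractVector{Int}) where {T,N}
    WeightedParticles{T,N}(p.particles[idx], p.logweights)
end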
Functions like +, *, etc. on WeightedParticles would require === on the logweights, to be sure that the entire system evolves as one. The next function is mostly for resampling. The idea is that when a resample is triggered, existing particles become "stale" and need to pass through this function to reindex the values.

I think there are a few ways of implementing this last bit, but I wanted to get your take on the shared weights idea first before digging into it much more.
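For example, the === requirement on elementwise operations could be enforced along these lines (a sketch building on the struct above; the error message and the choice of + as the example are illustrative):

function Base.:+(a::WeightedParticles{T,N}, b::WeightedParticles{T,N}) where {T,N}
    # refuse to combine variables that do not share the same log-weight vector
    a.logweights === b.logweights || error("WeightedParticles must share logweights")
    WeightedParticles{T,N}(a.particles .+ b.particles, a.logweights)
end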