
VariableNodeData discussion for upgrading to Manifolds.jl #763

Open
Affie opened this issue May 17, 2021 · 7 comments

@Affie
Member

Affie commented May 17, 2021

To start a discussion on how best to deprecate the val and bw fields from Array{Float64,2} to a vector of type P, where P is the type of a point on a manifold. Or possibly something different?

Serialization should also be considered.

A possible way is to use a Union as a temporary transition (although it might be slower) and then later change to a concrete parametric type:

mutable struct VariableNodeData{T<:InferenceVariable}
    """#TODO will become Vector of a point on the manifold, 
          but should be backward compatible with Array{Float64,2}"""
    val::Union{<:AbstractVector, Matrix{Float64}}
    """#TODO will become Vector of a bandwidth/covariance of a point, 
          but should be backward compatible with Array{Float64,2}"""
    bw::Union{<:AbstractVector, Matrix{Float64}}
    ...
end
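To illustrate, here is a minimal runnable sketch of the transitional Union approach. The struct name VNDTransition is hypothetical and the field set is trimmed down; the point is only that both the legacy Array{Float64,2} storage and the new Vector-of-points storage are accepted during the deprecation window:

```julia
# Simplified stand-in for the transitional struct (hypothetical name,
# reduced field set; the real VariableNodeData has more fields).
mutable struct VNDTransition
    val::Union{AbstractVector, Matrix{Float64}}
    bw::Union{AbstractVector, Matrix{Float64}}
end

# Legacy storage: each column of a matrix is one flattened point.
legacy = VNDTransition(randn(3, 100), randn(3, 1))

# New storage: a Vector of 2x2 rotation matrices (points on SO(2)).
pts = [ [cos(t) -sin(t); sin(t) cos(t)] for t in range(0, 1, length = 100) ]
new_vnd = VNDTransition(pts, [zeros(2, 2)])
```

Both constructions dispatch through the same field, which is what lets old call sites keep working while new code migrates.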

And then it can become:

mutable struct VariableNodeData{T<:InferenceVariable, P, B}
    "Vector of a point on the manifold"
    val::Vector{P}
    "Vector of a bandwith/covariance of a point"
    bw::Vector{B}
    ...
end
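A sketch of how the parametric version might be instantiated once the transition is done (again with hypothetical names and a reduced field set; P and B here are both concretely Matrix{Float64}, matching the SO(2) example below):

```julia
# Hypothetical simplified form of the parametric struct above.
abstract type InferenceVariable end
struct SomeVariable <: InferenceVariable end   # placeholder variable type

mutable struct VNDParametric{T<:InferenceVariable, P, B}
    val::Vector{P}   # points on the manifold
    bw::Vector{B}    # bandwidth/covariance per point
end

pts = [ [cos(t) -sin(t); sin(t) cos(t)] for t in range(0, 1, length = 10) ]
vnd = VNDParametric{SomeVariable, Matrix{Float64}, Matrix{Float64}}(pts, [zeros(2, 2)])
```

With concrete type parameters the fields are fully inferable, which is the main performance motivation for moving off the Union.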

We could also add 2 more fields and then just swap over from the one to the other.

I looked at TensorCast, but Manifolds.jl points do not fit into an Array{Float64,2}, e.g. a set of SO(2) points is a Vector of 2x2 matrices.
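To make the mismatch concrete, a stdlib-only sketch (no Manifolds.jl needed): N points on SO(2) are naturally a Vector of 2x2 matrices, which cannot live in a single Array{Float64,2}, though they can be stacked into a contiguous 2x2xN array:

```julia
# N points on SO(2) as a Vector of 2x2 rotation matrices.
ts = range(0, pi, length = 4)
pts = [ [cos(t) -sin(t); sin(t) cos(t)] for t in ts ]   # Vector{Matrix{Float64}}

# A contiguous alternative: stack the points along a third dimension.
A = cat(pts...; dims = 3)   # 2x2x4 array
```

The 2x2xN layout is the kind of representation that keeps generic array tooling applicable.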

@Affie Affie added the types, design, designdecision, VariableType (this used to be called softtype, #603), and manifolds labels on May 17, 2021
@Affie Affie added this to the v0.x.0 milestone May 17, 2021
@dehann
Member

dehann commented May 17, 2021

xref JuliaRobotics/IncrementalInference.jl#1242

> P is the type of a point on a manifold. Or possibly something different?

I agree: storing some <:Vector of points p::P, each being a <:ManifoldsBase type.

I think the Union approach is probably easiest -- that way we will hopefully get errors for any lingering usage of .val. We should probably do the warn-once-on-getfield thing again to be sure.
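A hedged sketch of what a warn-once-on-field-access shim could look like, assuming a simplified struct; the names (VND, val_internal) are illustrative, not IncrementalInference's actual API:

```julia
# Hypothetical struct with the data moved to a renamed internal field.
mutable struct VND
    val_internal::Vector{Matrix{Float64}}
end

const _WARNED = Set{Symbol}()

function Base.getproperty(x::VND, f::Symbol)
    if f === :val
        if !(:val in _WARNED)          # emit the deprecation warning only once
            push!(_WARNED, :val)
            @warn ".val now stores a Vector of manifold points"
        end
        return getfield(x, :val_internal)
    end
    return getfield(x, f)              # all other fields pass through untouched
end

x = VND([zeros(2, 2)])
x.val    # warns on first access, then behaves normally
```

This keeps old .val call sites working while flagging them, which matches the "get all the errors/warnings for lingering usage" goal.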

I'm less confident about what is going to happen to the .bw field as we progress -- going with ::Vector should be good enough to make the transition...

> TensorCast

that comment was just for convenience, the packages do not have much to do with each other at this point in time. (tensors and manifold operations are friends though...)

@mateuszbaran

Instead of Vector of points you may also consider using our power manifold: https://juliamanifolds.github.io/Manifolds.jl/stable/manifolds/power.html . It has the benefit of directly supporting, for example, a vector of points on SO(2) as a 2x2xN matrix, so you can use that with TensorCast, LoopVectorization or other generic libraries.

@kellertuer

I like that you use p::P consistently with ManifoldsBase; this keeps it quite general.
I second the usage of the PowerManifold for performance reasons.

I am not yet sure what the bws will actually store.

@Affie
Member Author

Affie commented May 18, 2021

Just as background: val and bw are currently used to store the kernels. From my understanding, bw (bandwidth) will be a tangent vector X on the manifold for the points stored in val (maybe X[i] for val[i], but it's only one for non-parametric currently), but I might be completely wrong @dehann? For parametric, bw will be the full covariance matrix.
This structure is an integral part and has to be deprecated cleanly if we don't want to spend months fixing bugs and breaking user code.

> Instead of Vector of points you may also consider using our power manifold: https://juliamanifolds.github.io/Manifolds.jl/stable/manifolds/power.html . It has the benefit of directly supporting, for example, a vector of points on SO(2) as a 2x2xN matrix, so you can use that with TensorCast, LoopVectorization or other generic libraries.
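For the batch cost evaluation, a stdlib-only sketch of how contiguous 2x2xN storage can be iterated without copying; the per-point cost here is a placeholder, not the actual cost in ParametricUtils.jl:

```julia
# 5 stacked 2x2 points in one contiguous array.
A = randn(2, 2, 5)

# Placeholder per-point cost (illustrative only).
cost(p) = sum(abs2, p)

# Evaluate over all points via non-copying views.
total = sum(cost(view(A, :, :, i)) for i in axes(A, 3))
```

Summing over views like this is the pattern the power-manifold layout enables for batch objectives.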

This is good to know thanks. I'll look into it some more. It will probably come in very handy with the parametric batch solution where I currently calculate the cost function based on all variables: https://github.com/JuliaRobotics/IncrementalInference.jl/blob/f37ef01755cf5b74c6cc062426d0cdc1de5d8307/src/ParametricUtils.jl#L195-L255

@mateuszbaran

Oh, I see you need the Hessian of the cost function at the minimizer, this will be fun on manifolds.

@kellertuer

You can approximate the Hessian quite well, see the quasi-Newton algorithm. I am currently working with a student on making this a little more generic, i.e. if you can provide the gradient, you can use an (L-)BFGS-type approximate Hessian wherever you want.
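As a Euclidean illustration of the quasi-Newton idea (the manifold versions additionally transport vectors between tangent spaces, which this sketch omits), one BFGS update of an approximate Hessian from a step s and a gradient difference y:

```julia
using LinearAlgebra

# Standard BFGS Hessian update: B + yy'/(y's) - (Bs)(Bs)'/(s'Bs).
function bfgs_update(B, s, y)
    Bs = B * s
    B + (y * y') / dot(y, s) - (Bs * Bs') / dot(s, Bs)
end

# On a quadratic f(x) = x'Ax/2 the gradient difference is y = A*s,
# so repeated updates drive B toward the true Hessian A.
A = [4.0 1.0; 1.0 3.0]
B = Matrix{Float64}(I, 2, 2)
s = [1.0, 0.0]
y = A * s
B2 = bfgs_update(B, s, y)
```

The updated B2 satisfies the secant condition B2*s == y exactly, which is the defining property of the update.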

@Affie
Member Author

Affie commented May 18, 2021

We use the Hessian to estimate the covariance, so we will look into the quasi-Newton algorithm, thanks. Perhaps other techniques to estimate the covariance are also available. I'm still a bit far from that point though.
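For reference, the Euclidean version of that covariance estimate is the Laplace approximation: at the minimizer of a negative log-likelihood, the covariance is approximately the inverse of the Hessian there. A minimal sketch:

```julia
using LinearAlgebra

# Hessian of the negative log-likelihood at the minimizer (assumed SPD).
H = [4.0 1.0; 1.0 3.0]

# Laplace-style covariance estimate: inverse Hessian.
Sigma = inv(Symmetric(H))
```

On a manifold the same idea applies in a tangent-space basis at the minimizer, which is where the quasi-Newton approximate Hessian would slot in.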
