Issues integrating on a Lie group #609

Closed
olivierverdier opened this issue May 20, 2023 · 14 comments

Comments

@olivierverdier
Contributor

Integrating a constant differential equation on a Lie group starting at the identity should give the same result as the group exponential. I have had many problems trying to verify that this is the case. Did I misunderstand how integration is supposed to work? Here is a minimal working example with various configurations.

Here is the code that produces the problem; I have some questions after the code.

using Manifolds
using ManifoldDiffEq
using OrdinaryDiffEq

# necessary for the SpecialEuclidean case to work?
Base.length(x::ProductRepr) = sum(map(Base.length, submanifold_components(x)))

# G = SpecialOrthogonal(4) # -> works
G = SpecialEuclidean(4) # -> wrong answer (@assert fails)
# G =  ProductGroup(ProductManifold(SpecialOrthogonal(4), TranslationGroup(3))) # -> wrong answer (@assert fails)
# G = TranslationGroup(4) # -> right answer, but expected is changed, so assert fails
# G = GeneralLinear(4) # -> doesn't work (missing method)
# G = HeisenbergGroup(4) # -> doesn't work (missing method)

M = G

action = GroupOperationAction(G)

# ---
# the type of vel differs depending on the call:
# it is either ProductRepr or ArrayPartition (??)
# vel = rand(G; vector_at=Identity(G))
vel = rand(G; vector_at=identity_element(G))
# ---

expected = exp_lie(G, vel)

dyn = ManifoldDiffEq.LieManifoldDiffEqOperator{Float64}() do u, p, t
    return vel
end

# ---
# Same here: different ways of initializing produce different errors
# u0 = Identity(G)
u0 = identity_element(G)
# ---

dt = .1 # should not matter at all
prob = ManifoldDiffEq.ManifoldODEProblem(dyn, u0, (0, 1.0), M)
alg = ManifoldDiffEq.ManifoldLieEuler(M, ExponentialRetraction(), action)
# a higher-order method should not help, and indeed it doesn't
# alg = ManifoldDiffEq.RKMK4(M, ExponentialRetraction(), action)

# sometimes, both expected and vel are changed by calling `solve`?
@show expected
@show vel
sol = solve(prob, alg, dt=dt)
@show expected
@show vel

# expected = exp_lie(G, vel)
computed = sol(1)
@assert isapprox(G, expected, computed)
  1. Why are there two types for elements of G = SpecialEuclidean (namely ArrayPartition and ProductRepr)? For instance, rand(G) and rand(G; vector_at=Identity(G)) produce different types.
  2. Is the length function needed to solve differential equations? If so, should it be implemented for ProductRepr and/or ArrayPartition?
  3. I'm a Julia newbie: I don't understand why a call to solve should change variables outside its scope, in particular expected (this happens for the translation group)?
  4. Even after fixing all these issues, the result still seems wrong for the SpecialEuclidean case. Am I doing something wrong?
@olivierverdier
Contributor Author

(Answering my own question 3: it makes sense that expected changes in the translation case, since it is set to vel, and vel seems to be changed by solve, for some still mysterious reason.)

@mateuszbaran
Member

This is actually a very interesting example! I will answer your questions a bit out of order.

4. Even after fixing all these issues, the result still seems wrong for the SpecialEuclidean case. Am I doing something wrong?

Integration on Lie groups with metrics that are not bi-invariant can be quite confusing. In your example the solver takes steps according to the metric:

alg = ManifoldDiffEq.ManifoldLieEuler(M, ExponentialRetraction(), action)

It can be changed so that the steps are taken according to the group exponential:

alg = ManifoldDiffEq.ManifoldLieEuler(M, Manifolds.GroupExponentialRetraction(), action)

This is part of the reason behind the difference. Next, Lie group solvers don't actually "go" in the direction returned by your f. The tangent vector that is returned is used to determine the step vector through apply_diff_group. I don't remember the details right now, but IIRC it boils down in this case to passing it through the differential of the right translation (i.e. opposite to the action). You might want to re-check the theory here because I might've made a mistake somewhere.
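For intuition, here is a rough conceptual sketch of what a single Lie–Euler step does (not ManifoldDiffEq.jl's actual code), reusing G, M, action, vel, u0 and dt from your example; the exact signatures of apply_diff_group and retract are assumptions based on the current Manifolds.jl API:

# map the Lie algebra element returned by f to a tangent vector at the current
# point via the differential of the group action, then step with a retraction
Y = apply_diff_group(action, Identity(G), vel, u0)     # assumed signature
u1 = retract(M, u0, dt * Y, ExponentialRetraction())   # one Euler step of size dt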

  1. Why are there two types for elements of G = SpecialEuclidean (namely ArrayPartition and ProductRepr)? For instance, rand(G) and rand(G; vector_at=Identity(G)) produce different types.

I made ProductRepr before I learned about ArrayPartition and haven't switched everything yet. ProductRepr is essentially semi-deprecated.

2. Is the length function needed to solve differential equations? If so, should it be implemented for ProductRepr and/or ArrayPartition?

Some parts of OrdinaryDiffEq need length, though I don't remember what for. It is implemented for ArrayPartition by default.
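For example, a quick check (a small sketch, assuming ArrayPartition from RecursiveArrayTools.jl is used as the point type) showing that length works out of the box there, which is exactly what is missing for ProductRepr:

using RecursiveArrayTools  # ArrayPartition lives here

x = ArrayPartition(zeros(3, 3), zeros(3))
length(x)  # 12: the sum of the component lengths, no extra definition needed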

I'm a Julia newbie: I don't understand why a call to solve should change variables outside its scope, in particular expected (this happens for the translation group)?

That may be my error -- ManifoldDiffEq.jl doesn't make copies for some caches. I will fix that.

@olivierverdier
Contributor Author

olivierverdier commented May 20, 2023

Thanks for your answer.

Some background to my thinking here:

I wanted to be able to integrate the differential equation g' = ξg, g(0) = 1, on a Lie group, where ξ is a constant element of the Lie algebra. How would I go about setting up such an equation on any of the groups defined in Manifolds.jl?

(Of course, I ultimately want to solve arbitrary differential equations on a Lie group, of the form g'(t) = ξ(g(t)) g(t), but I wanted to check that the constant case was handled correctly first.)

The exact solution to such an equation is g(t) = exp(tξ).
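As a sanity check, one could compare the numerical solution of the MWE above against this exact solution at a few times; a minimal sketch, reusing G, vel and sol from that code (the tolerance is only illustrative, and t * vel assumes scalar multiplication works for the chosen vector representation):

for t in (0.25, 0.5, 1.0)
    # g(t) = exp(t ξ) should match the numerical solution at time t
    @assert isapprox(G, exp_lie(G, t * vel), sol(t); atol=1e-5)
end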

Moreover, I wish to integrate using the connection defined by any one of the Cartan–Schouten family. When using any of those, any Lie group integrator should give the same result as computing the exponential of ξ directly (so, essentially the exact result). I didn't know ManifoldDiffEq.jl was choosing some other connection by default...

Some references to this point of view on integration on Lie groups (and in general homogeneous spaces): https://arxiv.org/abs/1402.6981

This being said, even using GroupExponentialRetraction does not make the assertion pass when using SpecialEuclidean... I'll try to investigate on my side.

@mateuszbaran
Member

I wanted to be able to integrate the differential equation g' = ξg, g(0) = 1, on a Lie group, where ξ is a constant element of the Lie algebra. How would I go about setting up such an equation on any of the groups defined in Manifolds.jl?

Your code should work fine for bi-invariant groups. I hadn't tried solving ODEs on the special Euclidean group before, so there might be some issue somewhere. Your example works on the special orthogonal group and on that product group (though the tolerance needs to be increased a bit).

I tried writing out explicitly what should happen in a two-step forward Euler (t = 0.5); here is how far I got:

G_TR = ConnectionManifold(G, CartanSchoutenZero()) # let's use Cartan-Schouten connection
u0_aff = affine_matrix(G, u0) # it's a bit easier to experiment with affine matrices
vel_aff = affine_matrix(G, vel)
a_aff = exp(G_TR, u0_aff, vel_aff/2)
a_aff = exp(G_TR, a_aff, translate_diff(G, a_aff, a_aff, vel_aff/2, LeftAction()))

After the last line, a_aff should be equal to expected, and it is for special orthogonal but not for SE(n). Could you help me with an example/reference of what the tangent vector in the second exponential should be (in place of translate_diff(G, a_aff, a_aff, vel_aff/2, LeftAction()))?

@olivierverdier
Contributor Author

Thank you for looking into this. There is probably a bug somewhere with SpecialEuclidean, or perhaps even with SemidirectProductGroup.
First, I guess it should be vel_aff = screw_matrix(G, vel)?

Then, in mathematical notation, it should just be the infinitesimal action of vel on the current point, so with matrices it is just the tangent vector vel_aff/2 * a_aff.

Actually, I find it easier in this case to work with the appropriate +/- Cartan connection (the one for which the derivative of the vector field ξg is zero in all directions); then the implementation (using connections) is exactly u1 = exp(ξ/2) u0 and u2 = exp(ξ/2) u1, where we see that u2 = exp(ξ) u0, the exact solution.

(I am pretty sure that any of the Cartan–Schouten connections should work though, since the geodesics are the same. The intermediate u1 would also be the same. Explicitly, if you used the other +/- CS connection, you would compute u2 = u1 exp(u1^{-1} (ξ/2) u1), which is the same as the u2 above.)
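In other words, a sketch of the corrected two-step computation in plain matrix form (my reconstruction, reusing G, u0, vel and expected from the MWE above; not tested against the package):

using LinearAlgebra  # for the matrix exponential

u0_aff = affine_matrix(G, u0)
vel_aff = screw_matrix(G, vel)            # screw_matrix for the algebra element
u1 = exp(Matrix(vel_aff) / 2) * u0_aff    # u1 = exp(ξ/2) u0
u2 = exp(Matrix(vel_aff) / 2) * u1        # u2 = exp(ξ/2) u1 = exp(ξ) u0
@assert isapprox(u2, affine_matrix(G, expected))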

@kellertuer
Member

Hi Olivier,
cool to see that you are trying out Julia :)
I saw in your code that you are missing some functions on some manifolds?

If you could open issues for those we can check how much we can help adding them.
Usually when we start a manifold, we try to collect at least a certain amount of methods, but there might always be a few that we miss or that we do not think of.
The error message should then usually be an ambiguity error (since in the end Julia is not sure “which method to take”), but it will tell you which method is missing.

Concerning integrators – we do not have many in ManifoldDiffEq.jl so far – so if you have a cool one and you see how you can contribute it, that would be very welcome!

Best regards from Trondheim,
Ronny

@mateuszbaran
Member

First, I guess it should be vel_aff = **screw**_matrix(G, vel)?

Yes, you are correct, and actually this fixes my example.

By the way, I think I've identified the bug -- it's actually fairly easy to fix.

@olivierverdier
Contributor Author

@mateuszbaran Yes! Well done, it seems that you have fixed the bug in e6702d6. The assertion now passes in the SpecialEuclidean case.

The only remaining (less urgent) issue is the confusion stemming from ProductRepr. This causes issues when using ManifoldDiffEq for various reasons (length not defined being one).

Thanks again! 🤩

@olivierverdier
Contributor Author

@kellertuer Thanks for your message! I knew this package is partly developed at NTNU. 😃

Originally, I wanted features for manifolds such as getting a basis of a tangent space. I started doing that in Python before realizing that you all had gotten much further along those lines. This alone drove me into the Julia world!
Integrating on a Lie group is icing on the cake for me, not completely necessary. However, what should work should work, and ultimately, the current issue uncovered a bug in Manifolds.jl. 😅

I have implemented a bunch of Lie group integrators in Python in homogint (see this demo).
This package, homogint, is also a proof of concept as to how to implement Lie group integrators in a systematic and easy way (using invariant connections and a graph associated to each method).

(Other than that, there are some parts in Manifolds.jl that seem incomplete. In particular SemidirectProduct.jl: exp_lie and adjoint_action fail for lack of some unimplemented method, for instance. I implemented my own version (the specialised case V^n ⋊ G, with the product representation of G on V); this could ultimately be incorporated into Manifolds.jl and potentially replace SpecialEuclidean. I'll share the code with you all when ready.)

Cheers!

@kellertuer
Member

kellertuer commented May 21, 2023

Yes, not all methods are implemented everywhere, that is correct. We try to do that step by step, but together with proper testing and documentation that takes some time; and since none of the main developers currently works with the semidirect product, some methods might still be missing. Opening issues for missing methods already helps us keep track (pointing to where such a method could be found in the literature would be awesome!).

We are always open to PRs on that – keeping track of missing methods with issues and working through them step by step. But three people working on JuliaManifolds is not that many (compared, for example, to the seemingly near-infinite team of geomstats), so we are sometimes not that fast at coding. We hope to be well documented, well tested and reasonably fast at runtime, though.
But for sure we always welcome PRs very much.

homogint sounds very promising; it would be very interesting to see whether that scheme could be realised in ManifoldDiffEq.jl together with DifferentialEquations.jl – though that is not the main area I work on (that's Manopt.jl), I try to help there as well :)

edit: Oh, I just checked, currently Manifolds.jl is 31k lines of code (including documentation, excluding tests) and we run 15k tests to cover 98+% of the code. We hope that makes bugs rare – but bugs like the one you discovered (an in-place side effect) are sometimes also hard to get right :) But thanks for finding it – only reporting those makes the code better!

@olivierverdier
Contributor Author

Yes, not all methods are implemented everywhere, that is correct. We try to do that step by step, but together with proper testing and documentation that takes some time; and since none of the main developers currently works with the semidirect product, some methods might still be missing. Opening issues for missing methods already helps us keep track (pointing to where such a method could be found in the literature would be awesome!).

Sure! I guess a first step would be to have automated testing for all (or many) of the group configurations (arbitrary products of semidirect products, etc.), and to test generic identities such as exp(ad ξ) = Ad_{exp ξ}, Ad_g Ad_{g'} = Ad_{gg'}, or solving a trivial differential equation just as I did. I guess the missing methods would quickly show up this way?
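For instance, such a generic test could look roughly like this (a sketch only; I'm assuming adjoint_action and compose behave as documented in Manifolds.jl, and the group and tolerance are just for illustration):

using Manifolds

G = SpecialOrthogonal(3)
g, h = rand(G), rand(G)
ξ = rand(G; vector_at = identity_element(G))

# Ad_g (Ad_h ξ) should equal Ad_{gh} ξ
@assert isapprox(adjoint_action(G, compose(G, g, h), ξ),
                 adjoint_action(G, g, adjoint_action(G, h, ξ)))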

(compared, for example, to the seemingly near-infinite team of geomstats), so we are sometimes not that fast at coding. We hope to be well documented, well tested and reasonably fast at runtime, though. But for sure we always welcome PRs very much.

  1. geomstats (which I considered using before stumbling across Manifolds.jl) is nowhere near as useful as Manifolds.jl; it has no group models, for instance. The systematic approach of Manifolds.jl makes it a far more interesting toolbox for manifold users, IMO.
  2. To increase the number of people engaged in Manifolds.jl, it could make sense to write a large number of tutorials (maybe like this one?) in accessible formats such as Jupyter/Pluto notebooks. What about using the same examples as in geomstats, or adding examples on robotics and more? (cf. this tomography library developed at KTH, which has 50+ tutorials)

edit: Oh, I just checked, currently Manifolds.jl is 31k lines of code (including documentation, excluding tests) and we run 15k tests to cover 98+% of the code.

That is remarkable indeed. 👍

We hope that makes bugs rare – but bugs like the one you discovered (an in-place side effect) are sometimes also hard to get right :) But thanks for finding it – only reporting those makes the code better!

My pleasure, thanks again to @mateuszbaran for this very quick fix!!

@kellertuer
Member

Sure! I guess a first step would be to have automated testing for all (or many) of the group configurations (arbitrary products of semidirect products, etc.), and to test generic identities such as exp(ad ξ) = Ad_{exp ξ}, Ad_g Ad_{g'} = Ad_{gg'}, or solving a trivial differential equation just as I did. I guess the missing methods would quickly show up this way?

Sure – I would love to move our test suite towards a bit more automated testing, where certain tests are then disabled, indicating which methods are missing.
Some methods might also (just) not be included because (a) no one needed them before or (most probably) (b) the person implementing said manifold / Lie group / metric / connection / ... was not aware of a formula / possible implementation at the time of writing the code.
And sure, further examples always help to spot them – and then opening issues to keep track of them and slowly working to implement them all, that would be great.

  1. geomstats (which I considered using before stumbling across Manifolds.jl) is nowhere nearly as useful as Manifolds.jl, it has no group models, for instance. The systematic approach of Manifolds.jl makes it a far more interesting toolbox for manifold users, IMO.

Very kind of you to say. I personally never worked with geomstats, since I “skipped” Python and moved from Matlab directly to Julia. They do have a systematic (object-oriented) approach as well, as far as I am aware, though it is – I think – a quite flat hierarchy and not much focused on “features” (like whether a manifold is a Lie group).

  1. To increase the number of people engaged in Manifolds.jl, it could make sense to write a large number of tutorials (maybe like this one?) in accessible formats such as Jupyter/Pluto notebooks. What about using the same examples as in geomstats, or adding examples on robotics and more? (cf. this tomography library developed at KTH, which has 50+ tutorials)

Yes! More tutorials would be great, but those also take time. I started ManoptExamples.jl recently as a collection of (well-tested, well-documented) optimisation problems / benchmarks and accompanying tutorials. Sadly, the PR you linked seems to have been discontinued, or at least inactive for a while now.
But yes – more tutorials would be great, either here in the docs or in a separate nice small package (like the one for Manopt).
Also, that is mainly a time issue – I have several ideas for what to do, for sure.

@mateuszbaran
Member

@mateuszbaran Yes! Well done, it seems that you have fixed the bug in e6702d6. The assertion now passes in the SpecialEuclidean case.

The only remaining (less urgent) issue is the confusion stemming from ProductRepr. This causes issues when using ManifoldDiffEq for various reasons (length not defined being one).

Thanks again! 🤩

You're welcome 🙂 . I will replace ProductRepr with ArrayPartition soon.

I have implemented a bunch of Lie group integrators in Python in homogint (see this demo).
This package, homogint, is also a proof of concept as to how to implement Lie group integrators in a systematic and easy way (using invariant connections and a graph associated to each method).

Cool, I will have to take a look at your package. When I was designing ManifoldDiffEq.jl, I missed it somehow and only knew about DiffMan. It was very difficult to design good abstractions based on the papers I had been reading, and I'm sure I must've missed some concepts. For example, I still don't have a good understanding of isotropy.

By the way, there hasn't been much activity in ManifoldDiffEq.jl recently, but I'm planning to go back to it later this year, mostly with the goal of parameter estimation for ODEs/SDEs on SE(n).

(Other than that, there are some parts in Manifolds.jl that seem incomplete. In particular SemidirectProduct.jl: exp_lie and adjoint_action fail for lack of some unimplemented method, for instance. I implemented my own version (the specialised case V^n ⋊ G, with the product representation of G on V); this could ultimately be incorporated into Manifolds.jl and potentially replace SpecialEuclidean. I'll share the code with you all when ready.)

Yes, we have some missing methods. We are a small team and there is a massive amount of things that could be useful, so we implement what we need or are interested in, and sometimes people ask for some functionality and we implement it. Feel free to open issues about things that are missing or incorrect 🙂 .

Sure! I guess a first step would be to have automated testing for all (or many) of the group configurations (arbitrary products of semidirect products, etc.), and to test generic identities such as exp(ad ξ) = Ad_{exp ξ}, Ad_g Ad_{g'} = Ad_{gg'}, or solving a trivial differential equation just as I did. I guess the missing methods would quickly show up this way?

Sure – I would love to move our test suite towards a bit more automated testing, where certain tests are then disabled, indicating which methods are missing. Some methods might also (just) not be included because (a) no one needed them before or (most probably) (b) the person implementing said manifold / Lie group / metric / connection / ... was not aware of a formula / possible implementation at the time of writing the code. And sure, further examples always help to spot them – and then opening issues to keep track of them and slowly working to implement them all, that would be great.

Yes, we are aware of a lot of missing methods, but we need to prioritize different tasks. If something is nontrivial (and, for example, many things on semidirect product groups are nontrivial) and we don't really need it at the moment, it is often left for later. For example, I remember spending a couple of hours trying to work out a generic implementation of the differential of the right translation on semidirect product groups, but in the end I only managed to work it out for SE(n).

(compared, for example, to the seemingly near-infinite team of geomstats), so we are sometimes not that fast at coding. We hope to be well documented, well tested and reasonably fast at runtime, though. But for sure we always welcome PRs very much.

  1. geomstats (which I considered using before stumbling across Manifolds.jl) is nowhere near as useful as Manifolds.jl; it has no group models, for instance. The systematic approach of Manifolds.jl makes it a far more interesting toolbox for manifold users, IMO.

Thank you 🙂 . I personally appreciate the work of geomstats people but I prefer Julia over Python, and our philosophy of interface design.

3. To increase the number of people engaged in Manifolds.jl, it could make sense to write a large number of tutorials (maybe like this one?) in accessible formats such as Jupyter/Pluto notebooks. What about using the same examples as in geomstats, or adding examples on robotics and more? (cf. this tomography library developed at KTH, which has 50+ tutorials)

Yes! More tutorials would be great, but those also take time. I started ManoptExamples.jl recently as a collection of (well-tested, well-documented) optimisation problems / benchmarks and accompanying tutorials. Sadly, the PR you linked seems to have been discontinued, or at least inactive for a while now. But yes – more tutorials would be great, either here in the docs or in a separate nice small package (like the one for Manopt). Also, that is mainly a time issue – I have several ideas for what to do, for sure.

More tutorials are a very good idea; I will try to make something when I have more time.

@olivierverdier
Contributor Author

I will replace ProductRepr with ArrayPartition soon.

That would be awesome!

I have implemented a bunch of Lie group integrators in Python in homogint (see this demo).
This package, homogint, is also a proof of concept as to how to implement Lie group integrators in a systematic and easy way (using invariant connections and a graph associated to each method).

Cool, I will have to take a look at your package. When I was designing ManifoldDiffEq.jl, I missed it somehow and only knew about DiffMan. It was very difficult to design good abstractions based on the papers I had been reading, and I'm sure I must've missed some concepts. For example, I still don't have a good understanding of isotropy.

Understanding isotropy was the motivation for writing the paper Integrators on homogeneous spaces: Isotropy choice and connections. The paper explains isotropy in §5 (in particular, on most homogeneous spaces there is a canonical isotropy choice, in one-to-one correspondence with the invariant affine connections).

The paper also shows how to put all Lie group integrators into the same general framework in §4. This makes it possible to implement any Lie group integrator with minimal effort (and homogint is a proof of concept for that).

Feel free to open issues about things that are missing or incorrect 🙂 .

Thanks, I will!
