
ForwardDiff.gradient fails for global map #17

Open
ggebbie opened this issue Oct 14, 2021 · 3 comments


ggebbie commented Oct 14, 2021

The global map relies upon an optimization that uses an explicitly supplied gradient. It would be nice to check that this gradient is accurate against automatic differentiation, e.g., ForwardDiff.jl or Zygote.jl. The TMI.jl code is not sufficiently generic to be used with ForwardDiff: it appears that the TMI.jl routines cannot handle the ForwardDiff dual-number container used for the control vector. Other issues, related to initialization with an arbitrary Real number type, seem to be fixed.
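
To make the intended check concrete, here is a minimal sketch with stand-in functions `J` and `∇J!` in place of the actual TMI.jl cost function and its hand-coded gradient:

```julia
using ForwardDiff

J(u) = sum(abs2, u)          # stand-in scalar cost function
∇J!(g, u) = (g .= 2 .* u)    # stand-in hand-coded gradient

u = rand(10)
g = similar(u)
∇J!(g, u)
g_ad = ForwardDiff.gradient(J, u)
maximum(abs.(g .- g_ad))     # ≈ 0 when the supplied gradient is correct
```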


ggebbie commented Oct 15, 2021

After eliminating unnecessary type annotations in the function and separating out the constant parameters in the argument list, we are left with an error related to the matrix left divide and the sparse LU decomposition.
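
The pattern looks roughly like this (a sketch with a made-up `misfit` function, not the actual TMI.jl signature): the differentiated argument is left untyped so ForwardDiff can substitute a vector of dual numbers, and the constant parameters are captured in a closure rather than being typed arguments.

```julia
using ForwardDiff

# Untyped first argument: accepts Vector{Float64} or Vector{<:ForwardDiff.Dual}.
misfit(uvec, params) = sum(abs2, uvec .- params.y)   # hypothetical misfit

params = (y = rand(5),)      # constant parameters, fixed during differentiation
f = u -> misfit(u, params)   # close over the constants
ForwardDiff.gradient(f, rand(5))
```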


ggebbie commented Oct 18, 2021

Note that ReverseDiff.jl, which is the better choice because of the large number of inputs (~2806) and small number of outputs (1), also has trouble with the sparse matrix left divide.
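
For context, reverse mode computes the full gradient in roughly one forward plus one reverse sweep regardless of the number of inputs, so the intended usage is simply (stand-in cost `f`; the real one contains the sparse left divide that breaks):

```julia
using ReverseDiff

f(u) = sum(abs2, u)              # stand-in scalar cost
u = rand(2806)
g = ReverseDiff.gradient(f, u)   # fails once f contains Alu \ b
```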

One workaround is the ILUZero.jl package. However, when differentiating a function whose sparse matrix left divide is provided by ILUZero.jl, I get the following error.
"""
ERROR: MethodError: no method matching adjoint(::ILU0Precon{Float64, Int64})
Closest candidates are:
adjoint(::FillArrays.Fill{T, 2, Axes} where Axes) where T at /home/gebbie/.julia/packages/FillArrays/Vzxer/src/fillalgebra.jl:16
adjoint(::LinearAlgebra.Transpose{var"#s46", var"#s45"} where {var"#s46"<:Real, var"#s45"<:Union{StaticArrays.StaticVector{N, T} where {N, T}, StaticArrays.StaticMatrix{N, M, T} where {N, M, T}}}) at /home/gebbie/.julia/packages/StaticArrays/OWJK7/src/linalg.jl:78
adjoint(::LinearAlgebra.Transpose{var"#s814", S} where {var"#s814"<:Real, S}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.6/LinearAlgebra/src/adjtrans.jl:166
...
Stacktrace:
[1] misfit_gridded_data(uvec::Vector{Float64}, Alu::ILU0Precon{Float64, Int64}, y::Array{Float64, 3}, d::Array{Float64, 3}, Wⁱ::LinearAlgebra.Diagonal{Float64, Vector{Float64}}, wet::BitArray{3})
@ TMI ~/projects/TMI.jl/src/TMI.jl:988
[2] fg(x::Vector{Float64})
@ Main ./REPL[41]:1
[3] top-level scope
@ REPL[43]:1
"""


ggebbie commented Oct 18, 2021

Got rid of the adjoint call by defining a function that computes just the cost function value, but ForwardDiff.gradient now fails with:
"""
ERROR: MethodError: no method matching ldiv!(::SuiteSparse.UMFPACK.UmfpackLU{Float64, Int64}, ::Vector{ForwardDiff.Dual{ForwardDiff.Tag{typeof(f_diffable), Float64}, Float64, 12}})
"""
