The Continuous Normalizing Flows example throws exception #953

Open
lanceXwq opened this issue Oct 7, 2024 · 2 comments
Labels
bug Something isn't working

Comments


lanceXwq commented Oct 7, 2024

Hello,

I tried the continuous normalizing flows example in the documentation. Although the code kept running, it threw exceptions at

res1 = Optimization.solve(
    optprob, OptimizationOptimisers.Adam(0.01); maxiters = 20, callback = cb)

and

res2 = Optimization.solve(optprob2, Optim.LBFGS(); allow_f_increases=false, callback=cb)
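
For reference, the failure happens inside the callback's logging call, not in the solve itself. A minimal reconstruction of the pattern, based on the stack trace below (the exact names in example.jl are assumptions, not the verbatim docs code):

# Hedged reconstruction of the failing pattern from the tutorial.
# In Optimization.jl v4 the callback's first argument is an
# Optimization.OptimizationState, not the raw parameter vector.
function cb(p, l)
    # `loss(p)` hands the OptimizationState to Lux, which then tries to
    # read `p.layer_1` and throws "type OptimizationState has no field layer_1".
    @info "Training" loss = loss(p)
    return false
end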

Error & Stacktrace ⚠️

Complete error message
┌ Error: Exception while generating log record in module Main at /home/lancexwq/Dropbox (ASU)/Code/Julia/CNF/example.jl:21
│   exception =
│    type OptimizationState has no field layer_1
│    Stacktrace:
│      [1] getproperty
│        @ ./Base.jl:37 [inlined]
│      [2] macro expansion
│        @ ~/.julia/packages/Lux/VkHFW/src/layers/containers.jl:0 [inlined]
│      [3] applychain(layers::@NamedTuple{layer_1::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}, layer_2::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}}, x::SubArray{Float32, 2, Matrix{Float32}, Tuple{UnitRange{Int64}, Base.Slice{Base.OneTo{Int64}}}, false}, ps::Optimization.OptimizationState{ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = 4:6)), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = 4:4)))}}}, Float32, ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = 4:6)), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = 4:4)))}}}, Nothing, Optimisers.Leaf{Optimisers.Adam, Tuple{Vector{Float32}, Vector{Float32}, Tuple{Float32, Float32}}}}, st::@NamedTuple{layer_1::@NamedTuple{}, layer_2::@NamedTuple{}})
│        @ Lux ~/.julia/packages/Lux/VkHFW/src/layers/containers.jl:482
│      [4] (::Chain{@NamedTuple{layer_1::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}, layer_2::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}}, Nothing})(x::SubArray{Float32, 2, Matrix{Float32}, Tuple{UnitRange{Int64}, Base.Slice{Base.OneTo{Int64}}}, false}, ps::Optimization.OptimizationState{ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = 4:6)), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = 4:4)))}}}, Float32, ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = 4:6)), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = 4:4)))}}}, Nothing, Optimisers.Leaf{Optimisers.Adam, Tuple{Vector{Float32}, Vector{Float32}, Tuple{Float32, Float32}}}}, st::@NamedTuple{layer_1::@NamedTuple{}, layer_2::@NamedTuple{}})
│        @ Lux ~/.julia/packages/Lux/VkHFW/src/layers/containers.jl:480
│      [5] apply(model::Chain{@NamedTuple{layer_1::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}, layer_2::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}}, Nothing}, x::SubArray{Float32, 2, Matrix{Float32}, Tuple{UnitRange{Int64}, Base.Slice{Base.OneTo{Int64}}}, false}, ps::Optimization.OptimizationState{ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = 4:6)), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = 4:4)))}}}, Float32, ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = 4:6)), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = 4:4)))}}}, Nothing, Optimisers.Leaf{Optimisers.Adam, Tuple{Vector{Float32}, Vector{Float32}, Tuple{Float32, Float32}}}}, st::@NamedTuple{layer_1::@NamedTuple{}, layer_2::@NamedTuple{}})
│        @ LuxCore ~/.julia/packages/LuxCore/IBKvY/src/LuxCore.jl:155
│      [6] (::StatefulLuxLayer{Static.True, Chain{@NamedTuple{layer_1::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}, layer_2::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}}, Nothing}, Optimization.OptimizationState{ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = 4:6)), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = 4:4)))}}}, Float32, ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = 4:6)), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = 4:4)))}}}, Nothing, Optimisers.Leaf{Optimisers.Adam, Tuple{Vector{Float32}, Vector{Float32}, Tuple{Float32, Float32}}}}, @NamedTuple{layer_1::@NamedTuple{}, layer_2::@NamedTuple{}}})(x::SubArray{Float32, 2, Matrix{Float32}, Tuple{UnitRange{Int64}, Base.Slice{Base.OneTo{Int64}}}, false}, p::Optimization.OptimizationState{ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = 4:6)), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = 4:4)))}}}, Float32, ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = 4:6)), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = 4:4)))}}}, Nothing, Optimisers.Leaf{Optimisers.Adam, Tuple{Vector{Float32}, Vector{Float32}, Tuple{Float32, Float32}}}})
│        @ Lux ~/.julia/packages/Lux/VkHFW/src/helpers/stateful.jl:119
│      [7] __ffjord(model::StatefulLuxLayer{Static.True, Chain{@NamedTuple{layer_1::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}, layer_2::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}}, Nothing}, Nothing, @NamedTuple{layer_1::@NamedTuple{}, layer_2::@NamedTuple{}}}, u::Matrix{Float32}, p::Optimization.OptimizationState{ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = 4:6)), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = 4:4)))}}}, Float32, ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = 4:6)), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = 4:4)))}}}, Nothing, Optimisers.Leaf{Optimisers.Adam, Tuple{Vector{Float32}, Vector{Float32}, Tuple{Float32, Float32}}}}, ad::AutoZygote, regularize::Bool, monte_carlo::Bool)
│        @ DiffEqFlux ~/.julia/packages/DiffEqFlux/u9oZE/src/ffjord.jl:83
│      [8] (::DiffEqFlux.var"#ffjord#11"{FFJORD{Chain{@NamedTuple{layer_1::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}, layer_2::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}}, Nothing}, Nothing, AutoZygote, Tuple{Int64}, Tuple{Float32, Float32}, Tuple{Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}}, @Kwargs{}}, StatefulLuxLayer{Static.True, Chain{@NamedTuple{layer_1::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}, layer_2::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}}, Nothing}, Nothing, @NamedTuple{layer_1::@NamedTuple{}, layer_2::@NamedTuple{}}}, Bool, Bool})(u::Matrix{Float32}, p::Optimization.OptimizationState{ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = 4:6)), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = 4:4)))}}}, Float32, ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = 4:6)), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = 4:4)))}}}, Nothing, Optimisers.Leaf{Optimisers.Adam, Tuple{Vector{Float32}, Vector{Float32}, Tuple{Float32, Float32}}}}, t::Float32)
│        @ DiffEqFlux ~/.julia/packages/DiffEqFlux/u9oZE/src/ffjord.jl:137
│      [9] (::ODEFunction{false, SciMLBase.AutoSpecialize, DiffEqFlux.var"#ffjord#11"{FFJORD{Chain{@NamedTuple{layer_1::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}, layer_2::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}}, Nothing}, Nothing, AutoZygote, Tuple{Int64}, Tuple{Float32, Float32}, Tuple{Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}}, @Kwargs{}}, StatefulLuxLayer{Static.True, Chain{@NamedTuple{layer_1::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}, layer_2::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}}, Nothing}, Nothing, @NamedTuple{layer_1::@NamedTuple{}, layer_2::@NamedTuple{}}}, Bool, Bool}, LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing})(::Matrix{Float32}, ::Vararg{Any})
│        @ SciMLBase ~/.julia/packages/SciMLBase/PTTHz/src/scimlfunctions.jl:2336
│     [10] initialize!(integrator::OrdinaryDiffEqCore.ODEIntegrator{Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}, false, Matrix{Float32}, Nothing, Float32, Optimization.OptimizationState{ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = 4:6)), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = 4:4)))}}}, Float32, ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = 4:6)), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = 4:4)))}}}, Nothing, Optimisers.Leaf{Optimisers.Adam, Tuple{Vector{Float32}, Vector{Float32}, Tuple{Float32, Float32}}}}, Float32, Float32, Float32, Float32, Vector{Matrix{Float32}}, ODESolution{Float32, 3, Vector{Matrix{Float32}}, Nothing, Nothing, Vector{Float32}, Vector{Vector{Matrix{Float32}}}, Nothing, ODEProblem{Matrix{Float32}, Tuple{Float32, Float32}, false, Optimization.OptimizationState{ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = 4:6)), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = 4:4)))}}}, Float32, ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = 4:6)), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = 4:4)))}}}, Nothing, Optimisers.Leaf{Optimisers.Adam, Tuple{Vector{Float32}, Vector{Float32}, Tuple{Float32, Float32}}}}, ODEFunction{false, SciMLBase.AutoSpecialize, DiffEqFlux.var"#ffjord#11"{FFJORD{Chain{@NamedTuple{layer_1::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}, layer_2::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}}, Nothing}, Nothing, AutoZygote, Tuple{Int64}, Tuple{Float32, Float32}, Tuple{Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}}, @Kwargs{}}, StatefulLuxLayer{Static.True, Chain{@NamedTuple{layer_1::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}, layer_2::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}}, Nothing}, Nothing, @NamedTuple{layer_1::@NamedTuple{}, layer_2::@NamedTuple{}}}, Bool, Bool}, LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, @Kwargs{}, SciMLBase.StandardODEProblem}, Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}, OrdinaryDiffEqCore.InterpolationData{ODEFunction{false, SciMLBase.AutoSpecialize, DiffEqFlux.var"#ffjord#11"{FFJORD{Chain{@NamedTuple{layer_1::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}, layer_2::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}}, Nothing}, Nothing, AutoZygote, Tuple{Int64}, Tuple{Float32, Float32}, Tuple{Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}}, @Kwargs{}}, StatefulLuxLayer{Static.True, Chain{@NamedTuple{layer_1::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}, layer_2::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}}, Nothing}, Nothing, 
@NamedTuple{layer_1::@NamedTuple{}, layer_2::@NamedTuple{}}}, Bool, Bool}, LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Matrix{Float32}}, Vector{Float32}, Vector{Vector{Matrix{Float32}}}, Nothing, OrdinaryDiffEqTsit5.Tsit5ConstantCache, Nothing}, SciMLBase.DEStats, Nothing, Nothing, Nothing}, ODEFunction{false, SciMLBase.AutoSpecialize, DiffEqFlux.var"#ffjord#11"{FFJORD{Chain{@NamedTuple{layer_1::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}, layer_2::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}}, Nothing}, Nothing, AutoZygote, Tuple{Int64}, Tuple{Float32, Float32}, Tuple{Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}}, @Kwargs{}}, StatefulLuxLayer{Static.True, Chain{@NamedTuple{layer_1::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}, layer_2::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}}, Nothing}, Nothing, @NamedTuple{layer_1::@NamedTuple{}, layer_2::@NamedTuple{}}}, Bool, Bool}, LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OrdinaryDiffEqTsit5.Tsit5ConstantCache, OrdinaryDiffEqCore.DEOptions{Float32, Float32, Float32, Float32, PIController{Rational{Int64}}, typeof(DiffEqBase.ODE_DEFAULT_NORM), typeof(LinearAlgebra.opnorm), Bool, CallbackSet{Tuple{}, Tuple{}}, typeof(DiffEqBase.ODE_DEFAULT_ISOUTOFDOMAIN), typeof(DiffEqBase.ODE_DEFAULT_PROG_MESSAGE), typeof(DiffEqBase.ODE_DEFAULT_UNSTABLE_CHECK), DataStructures.BinaryHeap{Float32, DataStructures.FasterForward}, DataStructures.BinaryHeap{Float32, DataStructures.FasterForward}, Nothing, Nothing, Int64, Tuple{}, Tuple{}, Tuple{}}, Matrix{Float32}, Float32, Nothing, OrdinaryDiffEqCore.DefaultInit, Nothing}, cache::OrdinaryDiffEqTsit5.Tsit5ConstantCache)
│        @ OrdinaryDiffEqTsit5 ~/.julia/packages/OrdinaryDiffEqTsit5/DHYtz/src/tsit_perform_step.jl:111
│     [11] __init(prob::ODEProblem{Matrix{Float32}, Tuple{Float32, Float32}, false, Optimization.OptimizationState{ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = 4:6)), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = 4:4)))}}}, Float32, ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = 4:6)), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = 4:4)))}}}, Nothing, Optimisers.Leaf{Optimisers.Adam, Tuple{Vector{Float32}, Vector{Float32}, Tuple{Float32, Float32}}}}, ODEFunction{false, SciMLBase.AutoSpecialize, DiffEqFlux.var"#ffjord#11"{FFJORD{Chain{@NamedTuple{layer_1::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}, layer_2::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}}, Nothing}, Nothing, AutoZygote, Tuple{Int64}, Tuple{Float32, Float32}, Tuple{Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}}, @Kwargs{}}, StatefulLuxLayer{Static.True, Chain{@NamedTuple{layer_1::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}, layer_2::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}}, Nothing}, Nothing, @NamedTuple{layer_1::@NamedTuple{}, layer_2::@NamedTuple{}}}, Bool, Bool}, LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, @Kwargs{}, SciMLBase.StandardODEProblem}, alg::Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}, timeseries_init::Tuple{}, ts_init::Tuple{}, ks_init::Tuple{}, recompile::Type{Val{true}}; saveat::Tuple{}, tstops::Tuple{}, d_discontinuities::Tuple{}, save_idxs::Nothing, save_everystep::Bool, save_on::Bool, save_start::Bool, save_end::Bool, callback::Nothing, dense::Bool, calck::Bool, dt::Float32, dtmin::Float32, dtmax::Float32, force_dtmin::Bool, adaptive::Bool, gamma::Rational{Int64}, abstol::Nothing, reltol::Nothing, qmin::Rational{Int64}, qmax::Int64, qsteady_min::Int64, qsteady_max::Int64, beta1::Nothing, beta2::Nothing, qoldinit::Rational{Int64}, controller::Nothing, fullnormalize::Bool, failfactor::Int64, maxiters::Int64, internalnorm::typeof(DiffEqBase.ODE_DEFAULT_NORM), internalopnorm::typeof(LinearAlgebra.opnorm), isoutofdomain::typeof(DiffEqBase.ODE_DEFAULT_ISOUTOFDOMAIN), unstable_check::typeof(DiffEqBase.ODE_DEFAULT_UNSTABLE_CHECK), verbose::Bool, timeseries_errors::Bool, dense_errors::Bool, advance_to_tstop::Bool, stop_at_next_tstop::Bool, initialize_save::Bool, progress::Bool, progress_steps::Int64, progress_name::String, progress_message::typeof(DiffEqBase.ODE_DEFAULT_PROG_MESSAGE), progress_id::Symbol, userdata::Nothing, allow_extrapolation::Bool, initialize_integrator::Bool, alias_u0::Bool, alias_du0::Bool, initializealg::OrdinaryDiffEqCore.DefaultInit, kwargs::@Kwargs{})
│        @ OrdinaryDiffEqCore ~/.julia/packages/OrdinaryDiffEqCore/55UVY/src/solve.jl:525
│     [12] __init (repeats 5 times)
│        @ ~/.julia/packages/OrdinaryDiffEqCore/55UVY/src/solve.jl:11 [inlined]
│     [13] #__solve#75
│        @ ~/.julia/packages/OrdinaryDiffEqCore/55UVY/src/solve.jl:6 [inlined]
│     [14] __solve
│        @ ~/.julia/packages/OrdinaryDiffEqCore/55UVY/src/solve.jl:1 [inlined]
│     [15] solve_call(_prob::ODEProblem{Matrix{Float32}, Tuple{Float32, Float32}, false, Optimization.OptimizationState{ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = 4:6)), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = 4:4)))}}}, Float32, ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = 4:6)), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = 4:4)))}}}, Nothing, Optimisers.Leaf{Optimisers.Adam, Tuple{Vector{Float32}, Vector{Float32}, Tuple{Float32, Float32}}}}, ODEFunction{false, SciMLBase.AutoSpecialize, DiffEqFlux.var"#ffjord#11"{FFJORD{Chain{@NamedTuple{layer_1::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}, layer_2::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}}, Nothing}, Nothing, AutoZygote, Tuple{Int64}, Tuple{Float32, Float32}, Tuple{Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}}, @Kwargs{}}, StatefulLuxLayer{Static.True, Chain{@NamedTuple{layer_1::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}, layer_2::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}}, Nothing}, Nothing, @NamedTuple{layer_1::@NamedTuple{}, layer_2::@NamedTuple{}}}, Bool, Bool}, LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, @Kwargs{}, SciMLBase.StandardODEProblem}, args::Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}; merge_callbacks::Bool, kwargshandle::Nothing, kwargs::@Kwargs{save_everystep::Bool, save_start::Bool, save_end::Bool})
│        @ DiffEqBase ~/.julia/packages/DiffEqBase/uqSeD/src/solve.jl:612
│     [16] solve_call
│        @ ~/.julia/packages/DiffEqBase/uqSeD/src/solve.jl:569 [inlined]
│     [17] #solve_up#53
│        @ ~/.julia/packages/DiffEqBase/uqSeD/src/solve.jl:1092 [inlined]
│     [18] solve_up
│        @ ~/.julia/packages/DiffEqBase/uqSeD/src/solve.jl:1078 [inlined]
│     [19] #solve#51
│        @ ~/.julia/packages/DiffEqBase/uqSeD/src/solve.jl:1015 [inlined]
│     [20] solve
│        @ ~/.julia/packages/DiffEqBase/uqSeD/src/solve.jl:1005 [inlined]
│     [21] __forward_ffjord(n::FFJORD{Chain{@NamedTuple{layer_1::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}, layer_2::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}}, Nothing}, Nothing, AutoZygote, Tuple{Int64}, Tuple{Float32, Float32}, Tuple{Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}}, @Kwargs{}}, x::Matrix{Float32}, ps::Optimization.OptimizationState{ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = 4:6)), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = 4:4)))}}}, Float32, ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = 4:6)), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = 4:4)))}}}, Nothing, Optimisers.Leaf{Optimisers.Adam, Tuple{Vector{Float32}, Vector{Float32}, Tuple{Float32, Float32}}}}, st::@NamedTuple{model::@NamedTuple{layer_1::@NamedTuple{}, layer_2::@NamedTuple{}}, regularize::Bool, monte_carlo::Bool})
│        @ DiffEqFlux ~/.julia/packages/DiffEqFlux/u9oZE/src/ffjord.jl:143
│     [22] FFJORD
│        @ ~/.julia/packages/DiffEqFlux/u9oZE/src/ffjord.jl:128 [inlined]
│     [23] apply
│        @ ~/.julia/packages/LuxCore/IBKvY/src/LuxCore.jl:155 [inlined]
│     [24] (::StatefulLuxLayer{Static.True, FFJORD{Chain{@NamedTuple{layer_1::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}, layer_2::Dense{typeof(tanh), Int64, Int64, Nothing, Nothing, Static.True}}, Nothing}, Nothing, AutoZygote, Tuple{Int64}, Tuple{Float32, Float32}, Tuple{Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}}, @Kwargs{}}, Nothing, @NamedTuple{model::@NamedTuple{layer_1::@NamedTuple{}, layer_2::@NamedTuple{}}, regularize::Bool, monte_carlo::Bool}})(x::Matrix{Float32}, p::Optimization.OptimizationState{ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = 4:6)), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = 4:4)))}}}, Float32, ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = 4:6)), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = 4:4)))}}}, Nothing, Optimisers.Leaf{Optimisers.Adam, Tuple{Vector{Float32}, Vector{Float32}, Tuple{Float32, Float32}}}})
│        @ Lux ~/.julia/packages/Lux/VkHFW/src/helpers/stateful.jl:119
│     [25] loss(::Optimization.OptimizationState{ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = 4:6)), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = 4:4)))}}}, Float32, ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = 4:6)), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = 4:4)))}}}, Nothing, Optimisers.Leaf{Optimisers.Adam, Tuple{Vector{Float32}, Vector{Float32}, Tuple{Float32, Float32}}}})
│        @ Main ~/Dropbox (ASU)/Code/Julia/CNF/example.jl:16
│     [26] macro expansion
│        @ ./logging.jl:373 [inlined]
│     [27] cb(p::Optimization.OptimizationState{ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = 4:6)), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = 4:4)))}}}, Float32, ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = 4:6)), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = 4:4)))}}}, Nothing, Optimisers.Leaf{Optimisers.Adam, Tuple{Vector{Float32}, Vector{Float32}, Tuple{Float32, Float32}}}}, l::Float32)
│        @ Main ~/Dropbox (ASU)/Code/Julia/CNF/example.jl:21
│     [28] macro expansion
│        @ ~/.julia/packages/OptimizationOptimisers/Geiis/src/OptimizationOptimisers.jl:100 [inlined]
│     [29] macro expansion
│        @ ~/.julia/packages/Optimization/DFBOb/src/utils.jl:32 [inlined]
│     [30] __solve(cache::OptimizationCache{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#11#12", OptimizationBase.var"#grad#16"{SciMLBase.NullParameters, OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#11#12", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, AutoForwardDiff{nothing, Nothing}}, OptimizationBase.var"#fg!#18"{SciMLBase.NullParameters, OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#11#12", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, AutoForwardDiff{nothing, Nothing}}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = 4:6)), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = 4:4)))}}}, SciMLBase.NullParameters}, Nothing, Nothing, Nothing, Nothing, Nothing, Optimisers.Adam, Bool, typeof(cb), Nothing})
│        @ OptimizationOptimisers ~/.julia/packages/OptimizationOptimisers/Geiis/src/OptimizationOptimisers.jl:80
│     [31] solve!(cache::OptimizationCache{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#11#12", OptimizationBase.var"#grad#16"{SciMLBase.NullParameters, OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#11#12", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, AutoForwardDiff{nothing, Nothing}}, OptimizationBase.var"#fg!#18"{SciMLBase.NullParameters, OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#11#12", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, AutoForwardDiff{nothing, Nothing}}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = 4:6)), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = 4:4)))}}}, SciMLBase.NullParameters}, Nothing, Nothing, Nothing, Nothing, Nothing, Optimisers.Adam, Bool, typeof(cb), Nothing})
│        @ SciMLBase ~/.julia/packages/SciMLBase/PTTHz/src/solve.jl:186
│     [32] solve(::OptimizationProblem{true, OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#11#12", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(layer_1 = ViewAxis(1:6, Axis(weight = ViewAxis(1:3, ShapedAxis((3, 1))), bias = 4:6)), layer_2 = ViewAxis(7:10, Axis(weight = ViewAxis(1:3, ShapedAxis((1, 3))), bias = 4:4)))}}}, SciMLBase.NullParameters, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, @Kwargs{}}, ::Optimisers.Adam; kwargs::@Kwargs{maxiters::Int64, callback::typeof(cb)})
│        @ SciMLBase ~/.julia/packages/SciMLBase/PTTHz/src/solve.jl:94
│     [33] top-level scope
│        @ ~/Dropbox (ASU)/Code/Julia/CNF/example.jl:29
│     [34] eval
│        @ ./boot.jl:385 [inlined]
│     [35] include_string(mapexpr::typeof(REPL.softscope), mod::Module, code::String, filename::String)
│        @ Base ./loading.jl:2076
│     [36] invokelatest(::Any, ::Any, ::Vararg{Any}; kwargs::@Kwargs{})
│        @ Base ./essentials.jl:892
│     [37] invokelatest(::Any, ::Any, ::Vararg{Any})
│        @ Base ./essentials.jl:889
│     [38] inlineeval(m::Module, code::String, code_line::Int64, code_column::Int64, file::String; softscope::Bool)
│        @ VSCodeServer ~/.vscode/extensions/julialang.language-julia-1.124.2/scripts/packages/VSCodeServer/src/eval.jl:271
│     [39] (::VSCodeServer.var"#69#74"{Bool, Bool, Bool, Module, String, Int64, Int64, String, VSCodeServer.ReplRunCodeRequestParams})()
│        @ VSCodeServer ~/.vscode/extensions/julialang.language-julia-1.124.2/scripts/packages/VSCodeServer/src/eval.jl:181
│     [40] withpath(f::VSCodeServer.var"#69#74"{Bool, Bool, Bool, Module, String, Int64, Int64, String, VSCodeServer.ReplRunCodeRequestParams}, path::String)
│        @ VSCodeServer ~/.vscode/extensions/julialang.language-julia-1.124.2/scripts/packages/VSCodeServer/src/repl.jl:276
│     [41] (::VSCodeServer.var"#68#73"{Bool, Bool, Bool, Module, String, Int64, Int64, String, VSCodeServer.ReplRunCodeRequestParams})()
│        @ VSCodeServer ~/.vscode/extensions/julialang.language-julia-1.124.2/scripts/packages/VSCodeServer/src/eval.jl:179
│     [42] hideprompt(f::VSCodeServer.var"#68#73"{Bool, Bool, Bool, Module, String, Int64, Int64, String, VSCodeServer.ReplRunCodeRequestParams})
│        @ VSCodeServer ~/.vscode/extensions/julialang.language-julia-1.124.2/scripts/packages/VSCodeServer/src/repl.jl:38
│     [43] (::VSCodeServer.var"#67#72"{Bool, Bool, Bool, Module, String, Int64, Int64, String, VSCodeServer.ReplRunCodeRequestParams})()
│        @ VSCodeServer ~/.vscode/extensions/julialang.language-julia-1.124.2/scripts/packages/VSCodeServer/src/eval.jl:150
│     [44] with_logstate(f::Function, logstate::Any)
│        @ Base.CoreLogging ./logging.jl:515
│     [45] with_logger
│        @ ./logging.jl:627 [inlined]
│     [46] (::VSCodeServer.var"#66#71"{VSCodeServer.ReplRunCodeRequestParams})()
│        @ VSCodeServer ~/.vscode/extensions/julialang.language-julia-1.124.2/scripts/packages/VSCodeServer/src/eval.jl:263
│     [47] #invokelatest#2
│        @ ./essentials.jl:892 [inlined]
│     [48] invokelatest(::Any)
│        @ Base ./essentials.jl:889
│     [49] (::VSCodeServer.var"#64#65")()
│        @ VSCodeServer ~/.vscode/extensions/julialang.language-julia-1.124.2/scripts/packages/VSCodeServer/src/eval.jl:34
└ @ Main ~/Dropbox (ASU)/Code/Julia/CNF/example.jl:21
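
The trace shows the callback (frame [27]) forwarding the OptimizationState it now receives straight into loss (frame [25]), where Lux then looks up ps.layer_1 on the state object. A sketch of a workaround, assuming the tutorial's loss(θ) signature, is to unwrap the parameters from the state's u field first:

# Sketch of a workaround, assuming the tutorial's loss(θ) signature.
function cb(state, l)
    # Optimization.jl v4 passes an OptimizationState; the current parameter
    # vector (a ComponentVector here) is stored in state.u.
    @info "Training" loss = loss(state.u)
    return false
end

Alternatively, since the second callback argument l already carries the current loss value, logging l directly avoids recomputing the loss inside the callback.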

Environment:

Project
Status `~/ASU Dropbox/Xu Weiqing/Code/Julia/CNF/Project.toml`
  [b0b7db55] ComponentArrays v0.15.17
  [aae7a2af] DiffEqFlux v4.0.0
  [31c24e10] Distributions v0.25.112
  [d0bbae9a] LuxCUDA v0.3.3
  [7f7a1694] Optimization v4.0.3
  [36348300] OptimizationOptimJL v0.4.1
  [42dfb2eb] OptimizationOptimisers v0.3.3
  [1dea7af3] OrdinaryDiffEq v6.89.0
  [9a3f8284] Random
Manifest
Status `~/ASU Dropbox/Xu Weiqing/Code/Julia/CNF/Manifest.toml`
  [47edcb42] ADTypes v1.9.0
  [621f4979] AbstractFFTs v1.5.0
  [1520ce14] AbstractTrees v0.4.5
  [7d9f7c33] Accessors v0.1.38
  [79e6a3ab] Adapt v4.0.4
  [66dad0bd] AliasTables v1.1.3
  [dce04be8] ArgCheck v2.3.0
  [ec485272] ArnoldiMethod v0.4.0
  [4fba245c] ArrayInterface v7.16.0
  [4c555306] ArrayLayouts v1.10.3
  [a9b6321e] Atomix v0.1.0
  [ab4f0b2a] BFloat16s v0.5.0
  [62783981] BitTwiddlingConvenienceFunctions v0.1.6
  [4544d5e4] Boltz v1.0.1
  [fa961155] CEnum v0.5.0
  [2a0fbf3d] CPUSummary v0.2.6
  [052768ef] CUDA v5.5.2
  [1af6417a] CUDA_Runtime_Discovery v0.3.5
  [7057c7e9] Cassette v0.3.13
  [082447d4] ChainRules v1.71.0
  [d360d2e6] ChainRulesCore v1.25.0
  [fb6a15b2] CloseOpenIntervals v0.1.13
  [3da002f7] ColorTypes v0.11.5
  [5ae59095] Colors v0.12.11
  [38540f10] CommonSolve v0.2.4
  [bbf7d656] CommonSubexpressions v0.3.1
  [f70d9fcc] CommonWorldInvalidations v1.0.0
  [34da2185] Compat v4.16.0
  [b0b7db55] ComponentArrays v0.15.17
  [a33af91c] CompositionsBase v0.1.2
  [2569d6c7] ConcreteStructs v0.2.3
  [88cd18e8] ConsoleProgressMonitor v0.1.2
  [187b0558] ConstructionBase v1.5.8
  [adafc99b] CpuId v0.3.1
  [a8cc5b0e] Crayons v4.1.1
  [9a962f9c] DataAPI v1.16.0
  [a93c6f00] DataFrames v1.7.0
  [864edb3b] DataStructures v0.18.20
  [e2d170a0] DataValueInterfaces v1.0.0
  [2b5f629d] DiffEqBase v6.157.0
⌅ [459566f4] DiffEqCallbacks v3.9.1
  [aae7a2af] DiffEqFlux v4.0.0
  [77a26b50] DiffEqNoiseProcess v5.23.0
  [163ba53b] DiffResults v1.1.0
  [b552c78f] DiffRules v1.15.1
  [a0c0ee7d] DifferentiationInterface v0.6.6
  [8d63f2c5] DispatchDoctor v0.4.15
  [31c24e10] Distributions v0.25.112
  [ffbed154] DocStringExtensions v0.9.3
  [da5c29d0] EllipsisNotation v1.8.0
  [4e289a0a] EnumX v1.0.4
  [7da242da] Enzyme v0.13.7
  [f151be2c] EnzymeCore v0.8.4
  [d4d017d3] ExponentialUtilities v1.26.1
  [e2ba6199] ExprTools v0.1.10
⌅ [6b7a57c9] Expronicon v0.8.5
  [7034ab61] FastBroadcast v0.3.5
  [9aa1b823] FastClosures v0.3.2
  [29a986be] FastLapackInterface v2.0.4
  [1a297f60] FillArrays v1.13.0
  [6a86dc24] FiniteDiff v2.26.0
  [53c48c17] FixedPointNumbers v0.8.5
  [f6369f11] ForwardDiff v0.10.36
  [f62d2435] FunctionProperties v0.1.2
  [069b7b12] FunctionWrappers v1.1.3
  [77dc65aa] FunctionWrappersWrappers v0.1.3
  [d9f16b24] Functors v0.4.12
  [0c68f7d7] GPUArrays v10.3.1
  [46192b85] GPUArraysCore v0.1.6
  [61eb1bfa] GPUCompiler v0.27.8
  [c145ed77] GenericSchur v0.5.4
  [86223c79] Graphs v1.12.0
  [3e5b6fbb] HostCPUFeatures v0.1.17
  [0e44f5e4] Hwloc v3.3.0
  [34004b35] HypergeometricFunctions v0.3.24
  [7869d1d1] IRTools v0.4.14
  [615f187c] IfElse v0.1.1
  [d25df0c9] Inflate v0.1.5
  [842dd82b] InlineStrings v1.4.2
  [3587e190] InverseFunctions v0.1.17
  [41ab1584] InvertedIndices v1.3.0
  [92d709cd] IrrationalConstants v0.2.2
  [82899510] IteratorInterfaceExtensions v1.0.0
  [692b3bcd] JLLWrappers v1.6.0
  [ef3ab10e] KLU v0.6.0
  [63c18a36] KernelAbstractions v0.9.28
  [ba0b0d4f] Krylov v0.9.6
  [5be7bae1] LBFGSB v0.4.1
  [929cbde3] LLVM v9.1.2
  [8b046642] LLVMLoopInfo v1.0.0
  [b964fa9f] LaTeXStrings v1.3.1
  [10f19ff3] LayoutPointers v0.1.17
  [5078a376] LazyArrays v2.2.1
  [1d6d02ad] LeftChildRightSiblingTrees v0.2.0
  [87fe0de2] LineSearch v0.1.3
  [d3d80556] LineSearches v7.3.0
  [7ed4a6bd] LinearSolve v2.35.0
  [2ab3a3ac] LogExpFunctions v0.3.28
  [e6f89c97] LoggingExtras v1.0.3
  [bdcacae8] LoopVectorization v0.12.171
  [30fc2ffe] LossFunctions v0.11.2
  [b2108857] Lux v1.1.0
  [d0bbae9a] LuxCUDA v0.3.3
  [bb33d45b] LuxCore v1.0.1
  [82251201] LuxLib v1.3.2
  [7e8f7934] MLDataDevices v1.2.0
  [d8e11817] MLStyle v0.4.17
  [1914dd2f] MacroTools v0.5.13
  [d125e4d3] ManualMemory v0.1.8
  [bb5d69b7] MaybeInplace v0.1.4
  [e1d29d7a] Missings v1.2.0
  [46d2c3a1] MuladdMacro v0.2.4
  [d41bc354] NLSolversBase v7.8.3
  [872c559c] NNlib v0.9.24
  [5da4648a] NVTX v0.3.4
  [77ba4419] NaNMath v1.0.2
  [8913a72c] NonlinearSolve v3.15.1
  [d8793406] ObjectFile v0.4.2
  [6fd5a793] Octavian v0.3.28
  [6fe1bfb0] OffsetArrays v1.14.1
  [429524aa] Optim v1.9.4
  [3bd65402] Optimisers v0.3.3
  [7f7a1694] Optimization v4.0.3
  [bca83a33] OptimizationBase v2.3.0
  [36348300] OptimizationOptimJL v0.4.1
  [42dfb2eb] OptimizationOptimisers v0.3.3
  [bac558e1] OrderedCollections v1.6.3
  [1dea7af3] OrdinaryDiffEq v6.89.0
  [89bda076] OrdinaryDiffEqAdamsBashforthMoulton v1.1.0
  [6ad6398a] OrdinaryDiffEqBDF v1.1.2
  [bbf590c4] OrdinaryDiffEqCore v1.6.0
  [50262376] OrdinaryDiffEqDefault v1.1.0
  [4302a76b] OrdinaryDiffEqDifferentiation v1.1.0
  [9286f039] OrdinaryDiffEqExplicitRK v1.1.0
  [e0540318] OrdinaryDiffEqExponentialRK v1.1.0
  [becaefa8] OrdinaryDiffEqExtrapolation v1.1.0
  [5960d6e9] OrdinaryDiffEqFIRK v1.1.1
  [101fe9f7] OrdinaryDiffEqFeagin v1.1.0
  [d3585ca7] OrdinaryDiffEqFunctionMap v1.1.1
  [d28bc4f8] OrdinaryDiffEqHighOrderRK v1.1.0
  [9f002381] OrdinaryDiffEqIMEXMultistep v1.1.0
  [521117fe] OrdinaryDiffEqLinear v1.1.0
  [1344f307] OrdinaryDiffEqLowOrderRK v1.2.0
  [b0944070] OrdinaryDiffEqLowStorageRK v1.2.1
  [127b3ac7] OrdinaryDiffEqNonlinearSolve v1.2.1
  [c9986a66] OrdinaryDiffEqNordsieck v1.1.0
  [5dd0a6cf] OrdinaryDiffEqPDIRK v1.1.0
  [5b33eab2] OrdinaryDiffEqPRK v1.1.0
  [04162be5] OrdinaryDiffEqQPRK v1.1.0
  [af6ede74] OrdinaryDiffEqRKN v1.1.0
  [43230ef6] OrdinaryDiffEqRosenbrock v1.2.0
  [2d112036] OrdinaryDiffEqSDIRK v1.1.0
  [669c94d9] OrdinaryDiffEqSSPRK v1.2.0
  [e3e12d00] OrdinaryDiffEqStabilizedIRK v1.1.0
  [358294b1] OrdinaryDiffEqStabilizedRK v1.1.0
  [fa646aed] OrdinaryDiffEqSymplecticRK v1.1.0
  [b1df2697] OrdinaryDiffEqTsit5 v1.1.0
  [79d7bb75] OrdinaryDiffEqVerner v1.1.1
  [90014a1f] PDMats v0.11.31
  [65ce6f38] PackageExtensionCompat v1.0.2
  [d96e819e] Parameters v0.12.3
  [e409e4f3] PoissonRandom v0.4.4
  [f517fe37] Polyester v0.7.16
  [1d0040c9] PolyesterWeave v0.2.2
  [2dfb63ee] PooledArrays v1.4.3
  [85a6dd25] PositiveFactorizations v0.2.4
  [d236fae5] PreallocationTools v0.4.24
  [aea7be01] PrecompileTools v1.2.1
  [21216c6a] Preferences v1.4.3
  [08abe8d2] PrettyTables v2.4.0
  [33c8b6b6] ProgressLogging v0.1.4
  [92933f4c] ProgressMeter v1.10.2
  [43287f4e] PtrArrays v1.2.1
  [1fd47b50] QuadGK v2.11.1
  [74087812] Random123 v1.7.0
  [e6cf234a] RandomNumbers v1.6.0
  [c1ae055f] RealDot v0.1.0
  [3cdcf5f2] RecipesBase v1.3.4
  [731186ca] RecursiveArrayTools v3.27.0
  [f2c3362d] RecursiveFactorization v0.2.23
  [189a3867] Reexport v1.2.2
  [ae029012] Requires v1.3.0
  [ae5879a3] ResettableStacks v1.1.1
  [37e2e3b7] ReverseDiff v1.15.3
  [79098fc4] Rmath v0.8.0
  [7e49a35a] RuntimeGeneratedFunctions v0.5.13
  [94e857df] SIMDTypes v0.1.0
  [476501e8] SLEEFPirates v0.6.43
  [0bca4576] SciMLBase v2.56.0
  [19f34311] SciMLJacobianOperators v0.1.0
  [c0aeaf25] SciMLOperators v0.3.10
  [1ed8b502] SciMLSensitivity v7.68.0
  [53ae85a6] SciMLStructures v1.5.0
  [6c6a2e73] Scratch v1.2.1
  [91c51154] SentinelArrays v1.4.5
  [efcf1570] Setfield v1.1.1
  [727e6d20] SimpleNonlinearSolve v1.12.3
  [699a6c99] SimpleTraits v0.9.4
  [ce78b400] SimpleUnPack v1.1.0
  [a2af1166] SortingAlgorithms v1.2.1
  [9f842d2f] SparseConnectivityTracer v0.6.6
  [47a9eef4] SparseDiffTools v2.22.0
  [dc90abb0] SparseInverseSubset v0.1.2
  [0a514795] SparseMatrixColorings v0.4.6
  [e56a9233] Sparspak v0.3.9
  [276daf66] SpecialFunctions v2.4.0
  [aedffcd0] Static v1.1.1
  [0d7ed370] StaticArrayInterface v1.8.0
  [90137ffa] StaticArrays v1.9.7
  [1e83bf80] StaticArraysCore v1.4.3
  [82ae8749] StatsAPI v1.7.0
  [2913bbd2] StatsBase v0.34.3
  [4c63d2b9] StatsFuns v1.3.2
  [7792a7ef] StrideArraysCore v0.5.7
  [892a3eda] StringManipulation v0.4.0
  [09ab397b] StructArrays v0.6.18
  [53d494c1] StructIO v0.3.1
  [2efcf032] SymbolicIndexingInterface v0.3.31
  [3783bdb8] TableTraits v1.0.1
  [bd369af6] Tables v1.12.0
  [5d786b92] TerminalLoggers v0.1.7
  [8290d209] ThreadingUtilities v0.5.2
  [a759f4b9] TimerOutputs v0.5.24
  [9f7883ad] Tracker v0.2.35
  [d5829a12] TriangularSolve v0.2.1
  [410a4b4d] Tricks v0.1.9
  [781d530d] TruncatedStacktraces v1.4.0
  [3a884ed6] UnPack v1.0.2
  [013be700] UnsafeAtomics v0.2.1
  [d80eeb9a] UnsafeAtomicsLLVM v0.2.1
  [3d5dd08c] VectorizationBase v0.21.70
  [19fa3120] VertexSafeGraphs v0.2.0
  [d49dbf32] WeightInitializers v1.0.3
  [e88e6eb3] Zygote v0.6.71
  [700de1a5] ZygoteRules v0.2.5
  [02a925ec] cuDNN v1.4.0
  [4ee394cb] CUDA_Driver_jll v0.10.3+0
  [76a88914] CUDA_Runtime_jll v0.15.3+0
  [62b44479] CUDNN_jll v9.4.0+0
  [7cc45869] Enzyme_jll v0.0.153+0
  [e33a78d0] Hwloc_jll v2.11.2+0
  [1d5cc7b8] IntelOpenMP_jll v2024.2.1+0
  [9c1d0b0a] JuliaNVTXCallbacks_jll v0.2.1+0
  [dad2f222] LLVMExtra_jll v0.0.34+0
  [81d17ec3] L_BFGS_B_jll v3.0.1+0
  [856f044c] MKL_jll v2024.2.0+0
  [e98f9f5b] NVTX_jll v3.1.0+2
  [efe28fd5] OpenSpecFun_jll v0.5.5+0
  [f50d1b31] Rmath_jll v0.5.1+0
  [1e29f10c] demumble_jll v1.3.0+0
  [1317d2d5] oneTBB_jll v2021.12.0+0
  [0dad84c5] ArgTools v1.1.1
  [56f22d72] Artifacts
  [2a0f44e3] Base64
  [ade2ca70] Dates
  [8ba89e20] Distributed
  [f43a241f] Downloads v1.6.0
  [7b1f6079] FileWatching
  [9fa8497b] Future
  [b77e0a4c] InteractiveUtils
  [4af54fe1] LazyArtifacts
  [b27032c2] LibCURL v0.6.4
  [76f85450] LibGit2
  [8f399da3] Libdl
  [37e2e46d] LinearAlgebra
  [56ddb016] Logging
  [d6f4376e] Markdown
  [a63ad114] Mmap
  [ca575930] NetworkOptions v1.2.0
  [44cfe95a] Pkg v1.10.0
  [de0858da] Printf
  [3fa0cd96] REPL
  [9a3f8284] Random
  [ea8e919c] SHA v0.7.0
  [9e88b42a] Serialization
  [1a1011a3] SharedArrays
  [6462fe0b] Sockets
  [2f01184e] SparseArrays v1.10.0
  [10745b16] Statistics v1.10.0
  [4607b0f0] SuiteSparse
  [fa267f1f] TOML v1.0.3
  [a4e569a6] Tar v1.10.0
  [8dfed614] Test
  [cf7118a7] UUIDs
  [4ec0a83e] Unicode
  [e66e0078] CompilerSupportLibraries_jll v1.1.1+0
  [deac9b47] LibCURL_jll v8.4.0+0
  [e37daf67] LibGit2_jll v1.6.4+0
  [29816b5a] LibSSH2_jll v1.11.0+1
  [c8ffd9c3] MbedTLS_jll v2.28.2+1
  [14a3606d] MozillaCACerts_jll v2023.1.10
  [4536629a] OpenBLAS_jll v0.3.23+4
  [05823500] OpenLibm_jll v0.8.1+2
  [bea87d4a] SuiteSparse_jll v7.2.1+1
  [83775a58] Zlib_jll v1.2.13+1
  [8e850b90] libblastrampoline_jll v5.11.0+0
  [8e850ede] nghttp2_jll v1.52.0+1
  [3f19e933] p7zip_jll v17.4.0+2
Info Packages marked with ⌅ have new versions available but compatibility constraints restrict them from upgrading. To see why use `status --outdated -m`
versioninfo
Julia Version 1.10.5
Commit 6f3fdf7b362 (2024-08-27 14:19 UTC)
Build Info:
  Official https://julialang.org/ release
Platform Info:
  OS: Linux (x86_64-linux-gnu)
  CPU: 8 × Intel(R) Core(TM) i7-7700K CPU @ 4.20GHz
  WORD_SIZE: 64
  LIBM: libopenlibm
  LLVM: libLLVM-15.0.7 (ORCJIT, skylake)
Threads: 1 default, 0 interactive, 1 GC (on 8 virtual cores)
Environment:
  JULIA_EDITOR = code
  JULIA_NUM_THREADS = 1
Thank you very much!
lanceXwq added the bug label Oct 7, 2024
@jholland1

I got a similar callback error message from the MNIST example. In v4.0 the MNIST example does not run to completion; in v3.5.0 it does run to completion, but with similar errors in the callback.

#954

@ChrisRackauckas
Member

This is addressed in the docs bump #950
