Updating SciML to ChainRules #69
@oxinabox for comfort.
Part of SciML/SciMLBase.jl#69, and needs to be done with SciML/SciMLSensitivity.jl#428. But currently getting:

```julia
using DiffEqSensitivity, OrdinaryDiffEq, Zygote

function fiip(du, u, p, t)
    du[1] = dx = p[1]*u[1] - p[2]*u[1]*u[2]
    du[2] = dy = -p[3]*u[2] + p[4]*u[1]*u[2]
end
function foop(u, p, t)
    dx = p[1]*u[1] - p[2]*u[1]*u[2]
    dy = -p[3]*u[2] + p[4]*u[1]*u[2]
    [dx, dy]
end

p = [1.5, 1.0, 3.0, 1.0]; u0 = [1.0; 1.0]
prob = ODEProblem(fiip, u0, (0.0, 10.0), p)
du01, dp1 = Zygote.gradient((u0, p) -> sum(solve(prob, Tsit5(), u0=u0, p=p, abstol=1e-14,
    reltol=1e-14, saveat=0.1, sensealg=QuadratureAdjoint())), u0, p)
```

```julia
ArgumentError: tuple must be non-empty
first(#unused#::Tuple{}) at tuple.jl:134
_unapply(t::Nothing, xs::Tuple{}) at lib.jl:163
_unapply(t::Tuple{Nothing}, xs::Tuple{}) at lib.jl:167
_unapply(t::Tuple{Tuple{Nothing}}, xs::Tuple{}) at lib.jl:167
_unapply(t::Tuple{NTuple{6, Nothing}, Tuple{Nothing}}, xs::Tuple{Nothing, Nothing, Nothing, Vector{Float64}, Vector{Float64}, Nothing}) at lib.jl:168
unapply(t::Tuple{NTuple{6, Nothing}, Tuple{Nothing}}, xs::Tuple{Nothing, Nothing, Nothing, Vector{Float64}, Vector{Float64}, Nothing}) at lib.jl:177
#193 at lib.jl:195 [inlined]
(::Zygote.var"#1713#back#195"{Zygote.var"#193#194"{Tuple{NTuple{6, Nothing}, Tuple{Nothing}}, Zygote.var"#kw_zpullback#40"{DiffEqSensitivity.var"#adjoint_sensitivity_backpass#179"{Base.Iterators.Pairs{Symbol, Float64, Tuple{Symbol, Symbol}, NamedTuple{(:abstol, :reltol), Tuple{Float64, Float64}}}, Tsit5, QuadratureAdjoint{0, true, Val{:central}, Bool}, Vector{Float64}, Vector{Float64}, Tuple{}, Colon, NamedTuple{(:abstol, :reltol), Tuple{Float64, Float64}}}}}})(Δ::FillArrays.Fill{Float64, 2, Tuple{Base.OneTo{Int64}, Base.OneTo{Int64}}}) at adjoint.jl:59
Pullback at solve.jl:70 [inlined]
(::typeof(∂(#solve#59)))(Δ::FillArrays.Fill{Float64, 2, Tuple{Base.OneTo{Int64}, Base.OneTo{Int64}}}) at interface2.jl:0
(::Zygote.var"#193#194"{Tuple{NTuple{6, Nothing}, Tuple{Nothing}}, typeof(∂(#solve#59))})(Δ::FillArrays.Fill{Float64, 2, Tuple{Base.OneTo{Int64}, Base.OneTo{Int64}}}) at lib.jl:194
(::Zygote.var"#1713#back#195"{Zygote.var"#193#194"{Tuple{NTuple{6, Nothing}, Tuple{Nothing}}, typeof(∂(#solve#59))}})(Δ::FillArrays.Fill{Float64, 2, Tuple{Base.OneTo{Int64}, Base.OneTo{Int64}}}) at adjoint.jl:59
Pullback at solve.jl:68 [inlined]
(::typeof(∂(solve##kw)))(Δ::FillArrays.Fill{Float64, 2, Tuple{Base.OneTo{Int64}, Base.OneTo{Int64}}}) at interface2.jl:0
Pullback at test.jl:14 [inlined]
(::typeof(∂(#7)))(Δ::Float64) at interface2.jl:0
(::Zygote.var"#46#47"{typeof(∂(#7))})(Δ::Float64) at interface.jl:41
gradient(::Function, ::Vector{Float64}, ::Vararg{Vector{Float64}, N} where N) at interface.jl:59
top-level scope at test.jl:14
eval at boot.jl:360 [inlined]
```
The current blocking issues:
Downstream testing revealed that Zygote actually ignored these rules 🤦, so those need to get reverted.
The end result of this is that, in order to pass tests, all packages needed to keep a few ZygoteRules adjoints, so it doesn't seem possible at this time to use strictly ChainRules for all of this. This should be double-checked in the future (with Diffractor).
```julia
using OrdinaryDiffEq, DiffEqSensitivity, DiffEqFlux, LinearAlgebra, Flux

nn = FastChain(FastDense(1,16), FastDense(16,16,tanh), FastDense(16,2))
initial = initial_params(nn)

function ode2!(u, p, t)
    f1, f2 = nn([t], p)
    [-f1^2; f2]
end

tspan = (0.0, 10.0)
prob = ODEProblem(ode2!, Complex{Float64}[0;0], tspan, initial)

function loss(p)
    sol = last(solve(prob, Tsit5(), p=p, sensealg=BacksolveAdjoint(autojacvec=ZygoteVJP())))
    return norm(sol)
end

result_ode = DiffEqFlux.sciml_train(loss, initial, ADAM(0.1), maxiters = 100)
```

showed that Zygote skips

```julia
function ChainRulesCore.rrule(::typeof(getindex), VA::AbstractVectorOfArray, i)
    function AbstractVectorOfArray_getindex_adjoint(Δ)
        Δ′ = [(i == j ? Δ : zero(x)) for (x, j) in zip(VA.u, 1:length(VA))]
        (NoTangent(), Δ′, NoTangent())
    end
    VA[i], AbstractVectorOfArray_getindex_adjoint
end

function ChainRulesCore.rrule(::typeof(getindex), VA::AbstractVectorOfArray, i, j...)
    function AbstractVectorOfArray_getindex_adjoint(Δ)
        Δ′ = zero(VA)
        Δ′[i, j...] = Δ
        (NoTangent(), Δ′, i, map(_ -> NoTangent(), j)...)
    end
    VA[i, j...], AbstractVectorOfArray_getindex_adjoint
end
```

because Zygote has its own `getindex` overloads.
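Since Zygote dispatches to its own `getindex` overloads before ever consulting this rule, the pullback construction itself can be sanity-checked in isolation. A minimal sketch in plain Julia (the `getindex_adjoint` helper and the plain `Vector`-of-arrays stand-in are illustrative, not the RecursiveArrayTools implementation):

```julia
# Stand-in for an AbstractVectorOfArray: a plain vector of arrays.
VA = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]

# Pullback for y = VA[i]: the cotangent Δ (same shape as VA[i]) is
# scattered back into a zeroed structure matching VA, with only slot i filled.
function getindex_adjoint(VA, i, Δ)
    [(i == j ? Δ : zero(x)) for (x, j) in zip(VA, 1:length(VA))]
end

Δ′ = getindex_adjoint(VA, 2, [10.0, 20.0])
# Only slot 2 carries the cotangent; the other slots are zero arrays.
```

This is exactly the scatter that the `rrule` above performs, minus the `NoTangent()` bookkeeping for the function and index arguments.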
Zygote ignores:

```julia
function ChainRulesCore.rrule(::Type{<:EnsembleGPUArray})
    EnsembleGPUArray(0.0), _ -> NoTangent()
end
```

so I needed to also keep:

```julia
ZygoteRules.@adjoint function EnsembleGPUArray()
    EnsembleGPUArray(0.0), _ -> nothing
end
```
```julia
function ChainRulesCore.rrule(f::ODEFunction, u, p, t)
    if f.vjp === nothing
        ChainRulesCore.rrule(f.f, u, p, t)
    else
        f.vjp(u, p, t)
    end
end
```

was skipped, so I added back:

```julia
ZygoteRules.@adjoint function (f::ODEFunction)(u, p, t)
    if f.vjp === nothing
        ZygoteRules._pullback(f.f, u, p, t)
    else
        f.vjp(u, p, t)
    end
end

ZygoteRules.@adjoint! function (f::ODEFunction)(du, u, p, t)
    if f.vjp === nothing
        ZygoteRules._pullback(f.f, du, u, p, t)
    else
        f.vjp(du, u, p, t)
    end
end
```
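For context on why two adjoints are needed here: an `ODEFunction`-style wrapper is callable with both an out-of-place signature `f(u, p, t)` and an in-place signature `f(du, u, p, t)`, and each call signature is a separate method that needs its own rule. A toy sketch of the dual-signature pattern (the `ToyODEFunction` type is hypothetical, purely illustrative, and not the SciMLBase definition):

```julia
# A callable struct with two call signatures, mirroring how ODEFunction
# supports both out-of-place and in-place (mutating) evaluation.
struct ToyODEFunction{F}
    f::F
end
(f::ToyODEFunction)(u, p, t) = f.f(u, p, t)              # out-of-place
(f::ToyODEFunction)(du, u, p, t) = (du .= f.f(u, p, t))  # in-place

f = ToyODEFunction((u, p, t) -> p .* u)
u = [1.0, 2.0]; p = [3.0, 4.0]
f(u, p, 0.0)                 # out-of-place call → [3.0, 8.0]
du = similar(u)
f(du, u, p, 0.0)             # in-place call mutates du
```

Since the in-place method mutates `du`, its adjoint has to go through `ZygoteRules.@adjoint!` rather than the plain `@adjoint`, which is why both definitions appear above.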
This should be a nice improvement overall to the health of the debugging experience. For example, the code from this post (https://discourse.julialang.org/t/optimizationmoi-ipopt-violating-inequality-constraint/92608) led to a question that took a bit to understand. But now when you run:

```julia
import Optimization
import OptimizationMOI, Ipopt

const AV{T} = AbstractVector{T}

function model_constraints!(out::AV{<:Real}, u::AV{<:Real}, data)
    # Model parameters
    dt, a, b = u
    out[1] = a - 1/dt # Must be < 0
    @info "Must be NEGATIVE: $(out[1])"
end

function model_variance(u::AV{T}, data::AV{<:Real}) where T<:Real
    # Model parameters
    dt, a, b = u
    # Compute variance
    variance = zeros(T, length(data))
    variance[1] = one(T)
    for t in 1:(length(data) - 1)
        variance[t+1] = (1 - dt * a) * variance[t] + dt * data[t]^2 + dt * b
    end
    variance
end

function model_loss(u::AV{T}, data::AV{<:Real})::T where T<:Real
    variance = model_variance(u, data)
    loglik::T = zero(T)
    for (r, var) in zip(data, variance)
        loglik += -(log(2π) + log(var) + r^2 / var) / 2
    end
    -loglik / length(data)
end

function model_fit(u0::AV{T}, data::AV{<:Real}) where T<:Real
    func = Optimization.OptimizationFunction(
        model_loss, Optimization.AutoForwardDiff(),
        cons=model_constraints!
    )
    prob = Optimization.OptimizationProblem(
        func, u0, data,
        # 0 < dt < 1  &&  1 < a < Inf  &&  0 < b < Inf
        lb=T[0.0, 1.0, 0.0], ub=T[1.0, Inf, Inf],
        #    ^dt  ^a   ^b        ^dt  ^a   ^b    <= model parameters
        lcons=T[-Inf], ucons=T[0.0] # a - 1/dt < 0
    )
    sol = Optimization.solve(prob, Ipopt.Optimizer())
    sol.u
end

let
    data = [
        2.1217711584057386, -0.28350145551002465, 2.3593492969513004, 0.192856733601849,
        0.4566485836385113, 1.332717934013979, -1.286716619379847, 0.9868669960185211,
        2.2358674776395224, -2.7933975791568098, 1.2555871497124622, 1.276879759908467,
        -0.8392016987911409, -1.1580875182201849, 0.33201646080578456, -0.17212553408696898,
        1.1275285626369556, 0.23041139849229036, 1.648423577528424, 2.384823597473343,
        -0.4005518932539747, -1.117737311211693, -0.9490152960583265, -1.1454539355078672,
        1.4158585811404159, -0.18926972177257692, -0.2867541528181491, -1.2077459688543788,
        -0.6397173049620141, 0.66147783407023, 0.049805188778543466, 0.902540117368457,
        -0.7018417933284938, 0.47342354473843684, 1.2620345361591596, -1.1483844812087018,
        -0.06487285080802752, 0.39020117013487715, -0.38454491504165356, 1.5125786171885645,
        -0.6751768274451174, 0.490916740658628, 0.012872300530924086, 0.46532447715746716,
        0.34734421531357157, 0.3830452463549559, -0.8730874028738718, 0.4333151627834603,
        -0.40396180775692375, 2.0794821773418497, -0.5392735774960918, 0.6519326323752113,
        -1.4844713145398716, 0.3688828625691108, 1.010912990717231, 0.5018274939956874,
        0.36656889279915833, -0.11403975693239479, -0.6460314660359935, -0.41997005020823147,
        0.9652752515820495, -0.37375868692702047, -0.5780729659197872, 2.642742798278919,
        0.5076984117208074, -0.4906395089461916, -1.804352047187329, -0.8596663844837792,
        -0.7510485548262176, -0.07922589350581195, 1.7201304839487317, 0.9024493222130577,
        -1.8216089665357902, 1.3929269238775426, -0.08410752079538407, 0.6423068180438288,
        0.6615201016351212, 0.18546977816594887, -0.717521690742993, -1.0224309324751113,
        1.7748350222721971, 0.1929546575877559, -0.1581871639724676, 0.20198379311238596,
        -0.6919373947349301, -0.9253274269423383, 0.549366272989534, -1.9302106783541606,
        0.7197247279281573, -1.220334158468621, -0.9187468058921053, -2.1452607604834184,
        -2.1558650694862687, -0.9387913392336701, -0.676637835687265, -0.16621998352492198,
        0.5637177022958897, -0.5258315560278541, 0.8413359958184765, -0.9096866525337141
    ]
    # u0 = [0 < dt < 1, 1 < a < 1/dt, 0 < b < Inf]
    u0 = [0.3, 2.3333333333333335, 0.33333333333333337]
    @assert 0 < u0[1] < 1
    @assert 1 < u0[2] < 1 / u0[1]
    @assert 0 < u0[3] < Inf
    @info "Optimizing..." u0
    model_fit(u0, data)
end
```

you get:

```julia
DomainError detected in the user `f` function. This occurs when the domain of a function is violated. For example, `log(-1.0)` is undefined because `log` of a real number is defined to only output real numbers, but `log` of a negative number is complex valued and therefore Julia throws a DomainError by default. Cases to be aware of include:

* `log(x)`, `sqrt(x)`, `cbrt(x)`, etc. where `x<0`
* `x^y` for `x<0` floating point `y` (example: `(-1.0)^(1/2) == im`)

Within the context of SciML, this error can occur within the solver process even if the domain constraint would not be violated in the solution due to adaptivity. For example, an ODE solver or optimization routine may check a step at `new_u` which violates the domain constraint, and if violated reject the step and use a smaller `dt`. However, the throwing of this error will have halted the solving process.

Thus the recommended fix is to replace this function with the equivalent ones from NaNMath.jl (https://github.com/JuliaMath/NaNMath.jl) which returns a NaN instead of an error. The solver will then effectively use the NaN within the error control routines to reject the out of bounds step. Additionally, one could perform a domain transformation on the variables so that such an issue does not occur in the definition of `f`.
For more information, check out the following FAQ page: https://docs.sciml.ai/Optimization/stable/API/FAQ/#The-Solver-Seems-to-Violate-Constraints-During-the-Optimization,-Causing-DomainErrors,-What-Can-I-Do-About-That? Note that detailed debugging information adds a small amount of overhead to SciML solves which can be disabled with the keyword argument `debug = NoDebug()`.

The detailed original error message information from Julia reproduced below:

ERROR: DomainError with -2.4941978436429695:
log will only return a complex result if called with a complex argument. Try log(Complex(x)).
Stacktrace:
  [1] (::SciMLBase.VerboseDebugFunction{typeof(SciMLBase.__solve)})(::SciMLBase.OptimizationProblem{true, SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Nothing, Vector{Float64}, Vector{Float64}, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, ::Vararg{Any})
    @ SciMLBase C:\Users\accou\.julia\dev\SciMLBase\src\debug.jl:66
  [2] #solve#572
    @ c:\Users\accou\.julia\dev\SciMLBase\src\solve.jl:87 [inlined]
  [3] solve
    @ c:\Users\accou\.julia\dev\SciMLBase\src\solve.jl:80 [inlined]
  [4] model_fit(u0::Vector{Float64}, data::Vector{Float64})
    @ Main c:\Users\accou\OneDrive\Computer\Desktop\test.jl:77
  [5] top-level scope
    @ c:\Users\accou\OneDrive\Computer\Desktop\test.jl:88
caused by: DomainError with -2.4941978436429695:
log will only return a complex result if called with a complex argument. Try log(Complex(x)).
Stacktrace:
  [1] throw_complex_domainerror(f::Symbol, x::Float64)
    @ Base.Math .\math.jl:33
  [2] _log(x::Float64, base::Val{:ℯ}, func::Symbol)
    @ Base.Math .\special\log.jl:301
  [3] log
    @ .\special\log.jl:267 [inlined]
  [4] model_loss(u::Vector{Float64}, data::Vector{Float64})
    @ Main c:\Users\accou\OneDrive\Computer\Desktop\test.jl:60
  [5] OptimizationFunction
    @ C:\Users\accou\.julia\dev\SciMLBase\src\scimlfunctions.jl:3580 [inlined]
  [6] eval_objective(moiproblem::OptimizationMOI.MOIOptimizationProblem{Float64, SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Optimization.var"#57#74"{ForwardDiff.GradientConfig{ForwardDiff.Tag{Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Float64}, Float64, 3, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Float64}, Float64, 3}}}, Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}}, Optimization.var"#60#77"{ForwardDiff.HessianConfig{ForwardDiff.Tag{Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Float64}, Float64, 3, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Float64}, ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Float64}, Float64, 3}, 3}}, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Float64}, Float64, 3}}}, Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}}, Optimization.var"#63#80", Optimization.var"#64#81"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Optimization.var"#66#83"{ForwardDiff.JacobianConfig{ForwardDiff.Tag{Optimization.var"#65#82"{Int64}, Float64}, Float64, 3, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#65#82"{Int64}, Float64}, Float64, 3}}}}, Optimization.var"#71#88"{Int64, Vector{ForwardDiff.HessianConfig{ForwardDiff.Tag{Optimization.var"#69#86"{Int64}, Float64}, Float64, 3, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#69#86"{Int64}, Float64}, ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#69#86"{Int64}, Float64}, Float64, 3}, 3}}, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#69#86"{Int64}, Float64}, Float64, 3}}}}, Vector{Optimization.var"#69#86"{Int64}}}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, Vector{Float64}, Matrix{Float64}, Matrix{Float64}, Matrix{Float64}}, x::Vector{Float64})
    @ OptimizationMOI C:\Users\accou\.julia\packages\OptimizationMOI\cHl7S\src\OptimizationMOI.jl:82
  [7] eval_objective(model::Ipopt.Optimizer, x::Vector{Float64})
    @ Ipopt C:\Users\accou\.julia\packages\Ipopt\rQctM\src\MOI_wrapper.jl:514
  [8] (::Ipopt.var"#eval_f_cb#1"{Ipopt.Optimizer})(x::Vector{Float64})
    @ Ipopt C:\Users\accou\.julia\packages\Ipopt\rQctM\src\MOI_wrapper.jl:597
  [9] _Eval_F_CB(n::Int32, x_ptr::Ptr{Float64}, x_new::Int32, obj_value::Ptr{Float64}, user_data::Ptr{Nothing})
    @ Ipopt C:\Users\accou\.julia\packages\Ipopt\rQctM\src\C_wrapper.jl:38
 [10] IpoptSolve(prob::Ipopt.IpoptProblem)
    @ Ipopt C:\Users\accou\.julia\packages\Ipopt\rQctM\src\C_wrapper.jl:442
 [11] optimize!(model::Ipopt.Optimizer)
    @ Ipopt C:\Users\accou\.julia\packages\Ipopt\rQctM\src\MOI_wrapper.jl:727
 [12] __solve(prob::SciMLBase.OptimizationProblem{true, SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Nothing, Vector{Float64}, Vector{Float64}, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, opt::Ipopt.Optimizer; maxiters::Nothing, maxtime::Nothing, abstol::Nothing, reltol::Nothing, kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
    @ OptimizationMOI C:\Users\accou\.julia\packages\OptimizationMOI\cHl7S\src\OptimizationMOI.jl:381
 [13] __solve(prob::SciMLBase.OptimizationProblem{true, SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Nothing, Vector{Float64}, Vector{Float64}, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, opt::Ipopt.Optimizer)
    @ OptimizationMOI C:\Users\accou\.julia\packages\OptimizationMOI\cHl7S\src\OptimizationMOI.jl:327
 [14] (::SciMLBase.VerboseDebugFunction{typeof(SciMLBase.__solve)})(::SciMLBase.OptimizationProblem{true, SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Nothing, Vector{Float64}, Vector{Float64}, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, ::Vararg{Any})
    @ SciMLBase C:\Users\accou\.julia\dev\SciMLBase\src\debug.jl:59
 [15] #solve#572
    @ c:\Users\accou\.julia\dev\SciMLBase\src\solve.jl:87 [inlined]
 [16] solve
    @ c:\Users\accou\.julia\dev\SciMLBase\src\solve.jl:80 [inlined]
 [17] model_fit(u0::Vector{Float64}, data::Vector{Float64})
    @ Main c:\Users\accou\OneDrive\Computer\Desktop\test.jl:77
 [18] top-level scope
    @ c:\Users\accou\OneDrive\Computer\Desktop\test.jl:88
```
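The fix the error message recommends (swapping `log` for a NaN-returning variant) can be sketched without the NaNMath.jl dependency; `safelog` below is a hypothetical stand-in for `NaNMath.log`, returning `NaN` on a non-positive argument so an adaptive optimizer can reject the out-of-bounds step instead of halting with a `DomainError`:

```julia
# Hypothetical stand-in for NaNMath.log: return NaN instead of throwing
# a DomainError when the optimizer probes a point outside the domain.
safelog(x::Real) = x > 0 ? log(x) : oftype(float(x), NaN)

safelog(2.0)                   # ordinary log for in-domain arguments
safelog(-2.4941978436429695)   # NaN rather than a DomainError
```

Replacing `log` with such a function inside `model_loss` lets Ipopt's line search see a NaN objective at the infeasible trial point and back off, which is exactly the behavior the error message describes for NaNMath.jl.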
The list of things to handle is:
The solve.jl one is a bit nasty because it calls all of the adjoints in DiffEqSensitivity.jl, which makes it somewhat breaking, so we can... just do it really quickly.
@frankschae @YingboMa