Add symbol based indexing for interpolated solutions #66
Conversation
I always thought at some point one could unify the standard interpolations in SciMLBase and OrdinaryDiffEq etc. Maybe the additional arguments might make this more difficult? I guess it doesn't matter though since there is no push in this direction currently AFAIK.
The lazy interpolations of the Verner methods make this a bit hard. Keeping the SciMLBase ones light makes sense IMO since the OrdinaryDiffEq.jl ones do a lot of very special stuff that nothing else needs.
Yes, I had only the Hermite and linear interpolations in mind.
That would make
Codecov Report
@@ Coverage Diff @@
## master #66 +/- ##
==========================================
+ Coverage 11.00% 13.36% +2.35%
==========================================
Files 39 39
Lines 2852 2888 +36
==========================================
+ Hits 314 386 +72
+ Misses 2538 2502 -36
Continue to review full report at Codecov.
Seems like most tests are failing?
src/solutions/ode_solutions.jl
Outdated
        sol.interp(t,idxs,deriv,sol.prob.p,continuity)
    elseif issymbollike(idxs)
        if t isa Real
            interp_sol = augment(sol.interp([t],nothing,deriv,sol.prob.p,continuity), sol)
Can't we use

    interp_sol = augment(sol.interp(t,nothing,deriv,sol.prob.p,continuity), sol)

instead of the current

    interp_sol = augment(sol.interp([t],nothing,deriv,sol.prob.p,continuity), sol)

?
We will have to add an extra constructor to DiffEqArray in RecursiveArrayTools.jl to support this. https://github.com/SciML/RecursiveArrayTools.jl/blob/master/src/vector_of_array.jl#L42-L46 doesn't cover the case where A.u is Vector{Real}.
Will be added in SciML/RecursiveArrayTools.jl#148
The current API of interpolation outputs Vector{<:Real} when the input t isa Real. DiffEqArray expects a vector of times t (Vector{<:Real}) and a vector of corresponding states u (Vector{Vector{<:Real}}). These two factors together make it difficult to accommodate what you are suggesting without making substantial changes to interpolation and DiffEqArray indexing.
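To make the shape mismatch concrete, here is a minimal illustrative sketch (not code from this PR; the data values are made up) of the container shapes DiffEqArray works with:

```julia
using RecursiveArrayTools

# DiffEqArray pairs a vector of state vectors with a vector of times.
u = [[1.0, 2.0], [1.5, 2.5], [2.0, 3.0]]  # Vector{Vector{Float64}}: one state per time point
t = [0.0, 0.5, 1.0]                       # Vector{Float64}: the matching times
A = DiffEqArray(u, t)                     # this is the shape the constructor covers

# Interpolating at a scalar time returns a single Vector{<:Real} (one state),
# not a vector of states, which is why the PR wraps a scalar time as `[t]`
# before calling the interpolant.
```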
src/solutions/ode_solutions.jl
Outdated
        end
    else
        if t isa Real
            interp_sol = augment(sol.interp([t],nothing,deriv,sol.prob.p,continuity), sol)
Same here, can't we use

    interp_sol = augment(sol.interp(t,nothing,deriv,sol.prob.p,continuity), sol)

instead of the current

    interp_sol = augment(sol.interp([t],nothing,deriv,sol.prob.p,continuity), sol)

?
src/solutions/ode_solutions.jl
Outdated
function (sol::ODESolution)(t::AbstractVector{<:Real},::Type{Val{0}},idxs::AbstractVector,continuity)
    if any(.!issymbollike.(idxs))
        error("Incorrect specification of `idxs`")
    end
    interp_sol = augment(sol.interp(t,nothing,Val{0},sol.prob.p,continuity), sol)
    u = [[interp_sol[idx][i] for idx in idxs] for i in 1:length(t)]
    DiffEqArray(u, t)
end
for T in 1:3
    function (sol::ODESolution)(t,::Type{Val{T}},idxs,continuity)
        error("Higher-order interpolation is not implemented.")
    end
end
Is this approach okay to handle higher-order interpolation? @devmotion
Just don't define these functions?
I guess the idea was to have a user-readable error. @YingboMa Any thoughts?
How about
function (sol::ODESolution)(t::AbstractVector{<:Real},::Type{Val{N}},idxs::AbstractVector,continuity) where N
N == 0 || error("Higher-order interpolation is not implemented.")
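If the error path is kept, one way to read this suggestion is to fold the guard into the Val{0} method shown earlier in this diff; a sketch only, not the final implementation:

```julia
function (sol::ODESolution)(t::AbstractVector{<:Real}, ::Type{Val{N}}, idxs::AbstractVector, continuity) where N
    # Only zeroth-order (value) interpolation is handled for symbolic idxs here.
    N == 0 || error("Higher-order interpolation is not implemented.")
    any(.!issymbollike.(idxs)) && error("Incorrect specification of `idxs`")
    interp_sol = augment(sol.interp(t, nothing, Val{0}, sol.prob.p, continuity), sol)
    u = [[interp_sol[idx][i] for idx in idxs] for i in 1:length(t)]
    DiffEqArray(u, t)
end
```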
@YingboMa I think erroring out at the sol(t, idxs=[...]) stage for higher-order interpolation is wrong (as @devmotion already mentioned). I just realized that it has already been implemented in many scenarios.
https://github.com/SciML/SciMLBase.jl/blob/master/src/interpolation.jl#L274-L314
https://github.com/SciML/SciMLBase.jl/blob/master/src/interpolation.jl#L335-L369
https://github.com/SciML/SciMLBase.jl/blob/master/src/interpolation.jl#L387
https://github.com/SciML/SciMLBase.jl/blob/master/src/interpolation.jl#L414
https://github.com/SciML/SciMLBase.jl/blob/master/src/interpolation.jl#L439
https://github.com/SciML/SciMLBase.jl/blob/master/src/interpolation.jl#L465
I have set a default at the interpolant stage in b97346c.
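For reference, a rough usage sketch of the symbol-based indexing this PR adds (the system and the symbolic states x and y are hypothetical placeholders; the return containers follow the discussion above):

```julia
# Assuming `sol` is an ODESolution of a symbolically defined (e.g. ModelingToolkit) system
# with states x and y.
sol(1.5, idxs=x)              # value of the single symbolic state x at t = 1.5
sol(1.5, idxs=[x, y])         # several symbolic states at a scalar time
sol([1.0, 1.5], idxs=[x, y])  # vector of times: returns a DiffEqArray of the requested states
```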
Weird. The downstream tests seem to pass locally.
https://github.com/SciML/SciMLBase.jl/pull/66/checks?check_run_id=2961581350#step:6:881 that one is slightly stochastic (we need to make it take more samples), so that's fine and would almost certainly pass if re-tested. https://github.com/SciML/SciMLBase.jl/pull/66/checks?check_run_id=2961581076 is worrisome: do your symbolic versions match?
@ChrisRackauckas I updated it to v1.2.1 and reconfirmed SciMLBase's downstream tests. Could you restart just that particular workflow?
Okay, it errored again. I will take a closer look.
Force-pushed from 5aff2b4 to b97346c (compare).
@YingboMa @ChrisRackauckas It looks like
We need to just get rid of that hack. @YingboMa @shashi can we just add
We are renaming istree to isterm today, so yes, it's already in TermInterface. We just need to use it in SU :p
Okay yes, so try and prioritize that and we'll change this from the hacky thing to some traits and proper dispatches.
@ChrisRackauckas The tests for this PR will now probably pass once SciML/RecursiveArrayTools.jl#152 is merged and released.
Shit that was a mistake. Reverting.
This should be a nice improvement overall to the health of the debugging experience. For example, the code from this post (https://discourse.julialang.org/t/optimizationmoi-ipopt-violating-inequality-constraint/92608) led to a question that took a bit to understand. But now when you run

```julia
import Optimization
import OptimizationMOI, Ipopt

const AV{T} = AbstractVector{T}

function model_constraints!(out::AV{<:Real}, u::AV{<:Real}, data)
    # Model parameters
    dt, a, b = u
    out[1] = a - 1/dt # Must be < 0
    @info "Must be NEGATIVE: $(out[1])"
end

function model_variance(u::AV{T}, data::AV{<:Real}) where T<:Real
    # Model parameters
    dt, a, b = u
    # Compute variance
    variance = zeros(T, length(data))
    variance[1] = one(T)
    for t in 1:(length(data) - 1)
        variance[t+1] = (1 - dt * a) * variance[t] + dt * data[t]^2 + dt * b
    end
    variance
end

function model_loss(u::AV{T}, data::AV{<:Real})::T where T<:Real
    variance = model_variance(u, data)
    loglik::T = zero(T)
    for (r, var) in zip(data, variance)
        loglik += -(log(2π) + log(var) + r^2 / var) / 2
    end
    -loglik / length(data)
end

function model_fit(u0::AV{T}, data::AV{<:Real}) where T<:Real
    func = Optimization.OptimizationFunction(
        model_loss, Optimization.AutoForwardDiff(),
        cons=model_constraints!
    )
    prob = Optimization.OptimizationProblem(
        func, u0, data,
        # 0 < dt < 1 && 1 < a < Inf && 0 < b < Inf
        lb=T[0.0, 1.0, 0.0], ub=T[1.0, Inf, Inf],
        #    ^dt  ^a   ^b        ^dt  ^a   ^b  <= model parameters
        lcons=T[-Inf], ucons=T[0.0] # a - 1/dt < 0
    )
    sol = Optimization.solve(prob, Ipopt.Optimizer())
    sol.u
end

let
    data = [
        2.1217711584057386, -0.28350145551002465, 2.3593492969513004, 0.192856733601849, 0.4566485836385113,
        1.332717934013979, -1.286716619379847, 0.9868669960185211, 2.2358674776395224, -2.7933975791568098,
        1.2555871497124622, 1.276879759908467, -0.8392016987911409, -1.1580875182201849, 0.33201646080578456,
        -0.17212553408696898, 1.1275285626369556, 0.23041139849229036, 1.648423577528424, 2.384823597473343,
        -0.4005518932539747, -1.117737311211693, -0.9490152960583265, -1.1454539355078672, 1.4158585811404159,
        -0.18926972177257692, -0.2867541528181491, -1.2077459688543788, -0.6397173049620141, 0.66147783407023,
        0.049805188778543466, 0.902540117368457, -0.7018417933284938, 0.47342354473843684, 1.2620345361591596,
        -1.1483844812087018, -0.06487285080802752, 0.39020117013487715, -0.38454491504165356, 1.5125786171885645,
        -0.6751768274451174, 0.490916740658628, 0.012872300530924086, 0.46532447715746716, 0.34734421531357157,
        0.3830452463549559, -0.8730874028738718, 0.4333151627834603, -0.40396180775692375, 2.0794821773418497,
        -0.5392735774960918, 0.6519326323752113, -1.4844713145398716, 0.3688828625691108, 1.010912990717231,
        0.5018274939956874, 0.36656889279915833, -0.11403975693239479, -0.6460314660359935, -0.41997005020823147,
        0.9652752515820495, -0.37375868692702047, -0.5780729659197872, 2.642742798278919, 0.5076984117208074,
        -0.4906395089461916, -1.804352047187329, -0.8596663844837792, -0.7510485548262176, -0.07922589350581195,
        1.7201304839487317, 0.9024493222130577, -1.8216089665357902, 1.3929269238775426, -0.08410752079538407,
        0.6423068180438288, 0.6615201016351212, 0.18546977816594887, -0.717521690742993, -1.0224309324751113,
        1.7748350222721971, 0.1929546575877559, -0.1581871639724676, 0.20198379311238596, -0.6919373947349301,
        -0.9253274269423383, 0.549366272989534, -1.9302106783541606, 0.7197247279281573, -1.220334158468621,
        -0.9187468058921053, -2.1452607604834184, -2.1558650694862687, -0.9387913392336701, -0.676637835687265,
        -0.16621998352492198, 0.5637177022958897, -0.5258315560278541, 0.8413359958184765
    ]

    # u0 = [0 < dt < 1, 1 < a < 1/dt, 0 < b < Inf]
    u0 = [0.3, 2.3333333333333335, 0.33333333333333337]
    @assert 0 < u0[1] < 1
    @assert 1 < u0[2] < 1 / u0[1]
    @assert 0 < u0[3] < Inf

    @info "Optimizing..." u0
    model_fit(u0, data)
end
```

you get:

```julia
DomainError detected in the user `f` function. This occurs when the domain of a function is violated.
For example, `log(-1.0)` is undefined because `log` of a real number is defined to only output real
numbers, but `log` of a negative number is complex valued and therefore Julia throws a DomainError
by default. Cases to be aware of include:

* `log(x)`, `sqrt(x)`, `cbrt(x)`, etc. where `x<0`
* `x^y` for `x<0` floating point `y` (example: `(-1.0)^(1/2) == im`)

Within the context of SciML, this error can occur within the solver process even if the domain constraint
would not be violated in the solution due to adaptivity. For example, an ODE solver or optimization routine
may check a step at `new_u` which violates the domain constraint, and if violated reject the step and use
a smaller `dt`. However, the throwing of this error will have halted the solving process.

Thus the recommended fix is to replace this function with the equivalent ones from NaNMath.jl
(https://github.com/JuliaMath/NaNMath.jl) which returns a NaN instead of an error. The solver will then
effectively use the NaN within the error control routines to reject the out of bounds step. Additionally,
one could perform a domain transformation on the variables so that such an issue does not occur in the
definition of `f`.

For more information, check out the following FAQ page:
https://docs.sciml.ai/Optimization/stable/API/FAQ/#The-Solver-Seems-to-Violate-Constraints-During-the-Optimization,-Causing-DomainErrors,-What-Can-I-Do-About-That?

Note that detailed debugging information adds a small amount of overhead to SciML solves which can be
disabled with the keyword argument `debug = NoDebug()`.

The detailed original error message information from Julia reproduced below:

ERROR: DomainError with -2.4941978436429695:
log will only return a complex result if called with a complex argument. Try log(Complex(x)).
Stacktrace:
  [1] (::SciMLBase.VerboseDebugFunction{typeof(SciMLBase.__solve)})(::SciMLBase.OptimizationProblem{true, SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Nothing, Vector{Float64}, Vector{Float64}, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, ::Vararg{Any})
    @ SciMLBase C:\Users\accou\.julia\dev\SciMLBase\src\debug.jl:66
  [2] #solve#572
    @ c:\Users\accou\.julia\dev\SciMLBase\src\solve.jl:87 [inlined]
  [3] solve
    @ c:\Users\accou\.julia\dev\SciMLBase\src\solve.jl:80 [inlined]
  [4] model_fit(u0::Vector{Float64}, data::Vector{Float64})
    @ Main c:\Users\accou\OneDrive\Computer\Desktop\test.jl:77
  [5] top-level scope
    @ c:\Users\accou\OneDrive\Computer\Desktop\test.jl:88

caused by: DomainError with -2.4941978436429695:
log will only return a complex result if called with a complex argument. Try log(Complex(x)).
Stacktrace: [1] throw_complex_domainerror(f::Symbol, x::Float64) @ Base.Math .\math.jl:33 [2] _log(x::Float64, base::Val{:ℯ}, func::Symbol) @ Base.Math .\special\log.jl:301 [3] log @ .\special\log.jl:267 [inlined] [4] model_loss(u::Vector{Float64}, data::Vector{Float64}) @ Main c:\Users\accou\OneDrive\Computer\Desktop\test.jl:60 [5] OptimizationFunction @ C:\Users\accou\.julia\dev\SciMLBase\src\scimlfunctions.jl:3580 [inlined] [6] eval_objective(moiproblem::OptimizationMOI.MOIOptimizationProblem{Float64, SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Optimization.var"#57#74"{ForwardDiff.GradientConfig{ForwardDiff.Tag{Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Float64}, Float64, 3, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Float64}, Float64, 3}}}, Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}}, Optimization.var"#60#77"{ForwardDiff.HessianConfig{ForwardDiff.Tag{Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Float64}, Float64, 3, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Float64}, ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Float64}, Float64, 3}, 3}}, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, 
Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Float64}, Float64, 3}}}, Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}}, Optimization.var"#63#80", Optimization.var"#64#81"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Optimization.var"#66#83"{ForwardDiff.JacobianConfig{ForwardDiff.Tag{Optimization.var"#65#82"{Int64}, Float64}, Float64, 3, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#65#82"{Int64}, Float64}, Float64, 3}}}}, Optimization.var"#71#88"{Int64, Vector{ForwardDiff.HessianConfig{ForwardDiff.Tag{Optimization.var"#69#86"{Int64}, Float64}, Float64, 3, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#69#86"{Int64}, Float64}, ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#69#86"{Int64}, Float64}, Float64, 3}, 3}}, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#69#86"{Int64}, Float64}, Float64, 3}}}}, Vector{Optimization.var"#69#86"{Int64}}}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, Vector{Float64}, Matrix{Float64}, Matrix{Float64}, Matrix{Float64}}, x::Vector{Float64}) @ OptimizationMOI C:\Users\accou\.julia\packages\OptimizationMOI\cHl7S\src\OptimizationMOI.jl:82 [7] eval_objective(model::Ipopt.Optimizer, x::Vector{Float64}) @ Ipopt C:\Users\accou\.julia\packages\Ipopt\rQctM\src\MOI_wrapper.jl:514 [8] (::Ipopt.var"#eval_f_cb#1"{Ipopt.Optimizer})(x::Vector{Float64}) @ Ipopt C:\Users\accou\.julia\packages\Ipopt\rQctM\src\MOI_wrapper.jl:597 [9] _Eval_F_CB(n::Int32, x_ptr::Ptr{Float64}, x_new::Int32, obj_value::Ptr{Float64}, user_data::Ptr{Nothing}) @ Ipopt C:\Users\accou\.julia\packages\Ipopt\rQctM\src\C_wrapper.jl:38 [10] IpoptSolve(prob::Ipopt.IpoptProblem) @ Ipopt C:\Users\accou\.julia\packages\Ipopt\rQctM\src\C_wrapper.jl:442 [11] optimize!(model::Ipopt.Optimizer) @ Ipopt C:\Users\accou\.julia\packages\Ipopt\rQctM\src\MOI_wrapper.jl:727 [12] __solve(prob::SciMLBase.OptimizationProblem{true, SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Nothing, Vector{Float64}, Vector{Float64}, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, opt::Ipopt.Optimizer; maxiters::Nothing, maxtime::Nothing, abstol::Nothing, reltol::Nothing, kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}) @ OptimizationMOI C:\Users\accou\.julia\packages\OptimizationMOI\cHl7S\src\OptimizationMOI.jl:381 [13] 
__solve(prob::SciMLBase.OptimizationProblem{true, SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Nothing, Vector{Float64}, Vector{Float64}, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, opt::Ipopt.Optimizer) @ OptimizationMOI C:\Users\accou\.julia\packages\OptimizationMOI\cHl7S\src\OptimizationMOI.jl:327 [14] (::SciMLBase.VerboseDebugFunction{typeof(SciMLBase.__solve)})(::SciMLBase.OptimizationProblem{true, SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Nothing, Vector{Float64}, Vector{Float64}, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, ::Vararg{Any}) @ SciMLBase C:\Users\accou\.julia\dev\SciMLBase\src\debug.jl:59 [15] #solve#572 @ c:\Users\accou\.julia\dev\SciMLBase\src\solve.jl:87 [inlined] [16] solve @ c:\Users\accou\.julia\dev\SciMLBase\src\solve.jl:80 [inlined] [17] model_fit(u0::Vector{Float64}, data::Vector{Float64}) @ Main c:\Users\accou\OneDrive\Computer\Desktop\test.jl:77 [18] top-level scope @ c:\Users\accou\OneDrive\Computer\Desktop\test.jl:88 ```
This PR requires SciML/RecursiveArrayTools.jl#143 and does the following: