Add symbol based indexing for interpolated solutions #66


Merged: 24 commits merged into SciML:master on Jul 5, 2021

Conversation

@sharanry (Contributor) commented Jun 18, 2021

This PR requires SciML/RecursiveArrayTools.jl#143 and adds symbol-based indexing for interpolated solutions.

@devmotion (Member)

I always thought that at some point one could unify the standard interpolations in SciMLBase and OrdinaryDiffEq etc. Maybe the additional arguments make this more difficult? I guess it doesn't matter, though, since there is currently no push in this direction AFAIK.

@ChrisRackauckas (Member)

The lazy interpolations of the Verner methods make this a bit hard. Keeping the SciMLBase ones light makes sense IMO since the OrdinaryDiffEq.jl ones do a lot of very special stuff that nothing else needs.

@devmotion (Member)

Yes, I had only the Hermite and linear interpolations in mind.

@ChrisRackauckas (Member)

That would make `dense=false` a type-unstable change, because it would change the underlying type of the interpolation instead of just checking the bool and doing the linear behavior. Constant prop may handle it, but 🤷
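
For illustration, here is a minimal, self-contained sketch of the type-stability trade-off being described; the types and functions below are hypothetical stand-ins, not SciMLBase internals.

```julia
# Hypothetical stand-ins for two interpolation behaviors (not SciMLBase types).
struct HermiteInterp end
struct LinearInterp end
dense_eval(::HermiteInterp, t) = t^3   # placeholder for a dense interpolant
linear_eval(t) = t                     # placeholder for the linear fallback

# Toy solution wrapper parameterized on its interpolation object.
struct ToySolution{I}
    interp::I
    dense::Bool
end

# Swapping the interpolation object when dense=false changes the wrapper's concrete
# type, which is the type-unstable change mentioned above.
typeof(ToySolution(HermiteInterp(), true)) == typeof(ToySolution(LinearInterp(), false))  # false

# Storing the same interpolation type regardless of `dense` and branching on the Bool
# at call time keeps the wrapper's type fixed.
interpolate(sol::ToySolution, t) = sol.dense ? dense_eval(sol.interp, t) : linear_eval(t)

interpolate(ToySolution(HermiteInterp(), false), 2.0)  # 2.0, via the linear branch
```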

@codecov (bot) commented Jun 21, 2021

Codecov Report

Merging #66 (b0341e0) into master (2c8d34d) will increase coverage by 2.35%.
The diff coverage is 80.00%.


```diff
@@            Coverage Diff             @@
##           master      #66      +/-   ##
==========================================
+ Coverage   11.00%   13.36%   +2.35%
==========================================
  Files          39       39
  Lines        2852     2888      +36
==========================================
+ Hits          314      386      +72
+ Misses       2538     2502      -36
```
| Impacted Files | Coverage | Δ |
| --- | --- | --- |
| src/interpolation.jl | 0.00% <0.00%> | (ø) |
| src/solutions/ode_solutions.jl | 40.00% <82.14%> | (+23.01%) ⬆️ |
| src/solutions/solution_interface.jl | 10.63% <100.00%> | (+10.63%) ⬆️ |
| src/scimlfunctions.jl | 7.50% <0.00%> | (+0.73%) ⬆️ |
| src/problems/ode_problems.jl | 12.30% <0.00%> | (+1.37%) ⬆️ |
| src/operators/operators.jl | 5.26% <0.00%> | (+5.26%) ⬆️ |
| src/function_wrappers.jl | 25.00% <0.00%> | (+25.00%) ⬆️ |

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data. Last update 2c8d34d...b0341e0.

@sharanry sharanry requested a review from ChrisRackauckas June 28, 2021 06:23
@ChrisRackauckas (Member)

Seems like most tests are failing?

```julia
        sol.interp(t,idxs,deriv,sol.prob.p,continuity)
    elseif issymbollike(idxs)
        if t isa Real
            interp_sol = augment(sol.interp([t],nothing,deriv,sol.prob.p,continuity), sol)
```
Member

Can't we use

```diff
-interp_sol = augment(sol.interp([t],nothing,deriv,sol.prob.p,continuity), sol)
+interp_sol = augment(sol.interp(t,nothing,deriv,sol.prob.p,continuity), sol)
```

Contributor Author (@sharanry)

We will have to add an extra constructor to DiffEqArray in RecursiveArrayTools.jl to support this. https://github.com/SciML/RecursiveArrayTools.jl/blob/master/src/vector_of_array.jl#L42-L46 doesn't cover the case where A.u is Vector{Real}.

Contributor Author (@sharanry), Jun 29, 2021

The current API of interpolation outputs Vector{<:Real} when the input t isa Real. DiffEqArray expects a vector of times t (Vector{<:Real}) and vector of corresponding states u (Vector{Vector{<:Real}}).

These two factors together make it difficult to accommodate what you are suggesting without making substantial changes to interpolation and DiffEqArray indexing.
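
For context, here is a minimal sketch of the constraint described above, assuming only the standard DiffEqArray(u, t) constructor from RecursiveArrayTools (the values are made up):

```julia
using RecursiveArrayTools

t = 0.5                      # a scalar time, as in the `t isa Real` branch
u_states = [[1.0, 2.0]]      # DiffEqArray expects a Vector of state vectors...
t_vec = [t]                  # ...and a matching Vector{<:Real} of times

A = DiffEqArray(u_states, t_vec)   # works with the vector-of-vectors layout

# A scalar-time interpolation returns a single state (a Vector{<:Real}), which this
# constructor does not cover; hence the `[t]` wrapping in the code under review.
```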

```julia
        end
    else
        if t isa Real
            interp_sol = augment(sol.interp([t],nothing,deriv,sol.prob.p,continuity), sol)
```
Member

Same here, can't we use

```diff
-interp_sol = augment(sol.interp([t],nothing,deriv,sol.prob.p,continuity), sol)
+interp_sol = augment(sol.interp(t,nothing,deriv,sol.prob.p,continuity), sol)
```

Comment on lines 65 to 78
```julia
function (sol::ODESolution)(t::AbstractVector{<:Real},::Type{Val{0}},idxs::AbstractVector,continuity)
    if any(.!issymbollike.(idxs))
        error("Incorrect specification of `idxs`")
    end
    interp_sol = augment(sol.interp(t,nothing,Val{0},sol.prob.p,continuity), sol)
    u = [[interp_sol[idx][i] for idx in idxs] for i in 1:length(t)]
    DiffEqArray(u, t)
end

for T in 1:3
    function (sol::ODESolution)(t,::Type{Val{T}},idxs,continuity)
        error("Higher-order interpolation is not implemented.")
    end
end
```
Contributor Author (@sharanry)

Is this approach okay to handle higher-order interpolation? @devmotion

Member

Just don't define these functions?

Contributor Author (@sharanry)

I guess the idea was to have a user-readable error. @YingboMa Any thoughts?

Member

How about

```julia
function (sol::ODESolution)(t::AbstractVector{<:Real},::Type{Val{N}},idxs::AbstractVector,continuity) where N
    N == 0 || error("Higher-order interpolation is not implemented.")
```
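
As a standalone illustration of that suggestion (the helper name below is made up, not the actual ODESolution method), the `where N` guard keeps a single definition while giving a readable error for higher orders:

```julia
# Hypothetical helper: accept only Val{0} and raise a user-readable error otherwise.
function order_guard(::Type{Val{N}}) where N
    N == 0 || error("Higher-order interpolation is not implemented.")
    return N
end

order_guard(Val{0})    # returns 0
# order_guard(Val{1})  # throws "Higher-order interpolation is not implemented."
```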

Contributor Author (@sharanry), Jul 1, 2021

I have set a default at the interpolant stage in b97346c.

@sharanry sharanry requested review from devmotion and YingboMa July 1, 2021 06:05
@sharanry (Contributor Author) commented Jul 1, 2021

Weird. The downstream tests seem to pass locally.

@ChrisRackauckas (Member)

https://github.com/SciML/SciMLBase.jl/pull/66/checks?check_run_id=2961581350#step:6:881 that one is slightly stochastic (we need to make it take more samples), so that's fine and would almost certainly pass if re-tested.

https://github.com/SciML/SciMLBase.jl/pull/66/checks?check_run_id=2961581076 is worrisome: do your symbolic versions match?

@sharanry (Contributor Author) commented Jul 1, 2021

@ChrisRackauckas I updated it to v1.2.1 and reconfirmed SciMLBase's downstream tests. Could you restart just that particular workflow?

@sharanry (Contributor Author) commented Jul 1, 2021

Okay, it errored again. I will take a closer look.

@sharanry sharanry force-pushed the sy/interp_elim_var_access branch from 5aff2b4 to b97346c Compare July 1, 2021 14:59
@sharanry (Contributor Author) commented Jul 1, 2021

@YingboMa @ChrisRackauckas It looks like issymbollike() is erroneously returning false on Symbolics.Num objects, but only in the sandboxed downstream test environment. It correctly returns true when I import ModelingToolkit/RecursiveArrayTools normally. Any idea what could lead to such behavior?

@ChrisRackauckas (Member)

We need to just get rid of that hack. @YingboMa @shashi can we just add istree to https://github.com/JuliaSymbolics/TermInterface.jl/blob/master/src/TermInterface.jl and depend on it here for this?

@shashi commented Jul 1, 2021

We are renaming istree to isterm today, so yes, it's already in TermInterface. We just need to use it in SU :p

@ChrisRackauckas (Member)

Okay, yes. Try to prioritize that, and we'll change this from the hacky thing to proper traits and dispatches.
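
A rough sketch of what that could look like, assuming TermInterface exports istree as discussed; the definition below is illustrative, not the actual SciMLBase code:

```julia
# Illustrative only: detect "symbol-like" indices via TermInterface.istree
# instead of the current type-name-based hack.
using TermInterface: istree

is_symbol_like(x) = x isa Symbol || istree(x)
is_symbol_like(xs::Union{AbstractArray,Tuple}) = all(is_symbol_like, xs)
```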

@sharanry (Contributor Author) commented Jul 5, 2021

@ChrisRackauckas The tests for this PR will now probably pass once SciML/RecursiveArrayTools.jl#152 is merged and released.

@ChrisRackauckas ChrisRackauckas merged commit 0ddde22 into SciML:master Jul 5, 2021
@ChrisRackauckas (Member)

Shit that was a mistake. Reverting.

ChrisRackauckas added a commit that referenced this pull request Jan 18, 2023
This should be a nice improvement overall to the health of the debugging experience. For example, the code from this post (https://discourse.julialang.org/t/optimizationmoi-ipopt-violating-inequality-constraint/92608) led to a question that took a bit to understand. But now when you run

```julia
import Optimization
import OptimizationMOI, Ipopt

const AV{T} = AbstractVector{T}

function model_constraints!(out::AV{<:Real}, u::AV{<:Real}, data)
    # Model parameters
    dt, a, b = u
    out[1] = a - 1/dt # Must be < 0
    @info "Must be NEGATIVE: $(out[1])"
end

function model_variance(u::AV{T}, data::AV{<:Real}) where T<:Real
    # Model parameters
    dt, a, b = u
    # Compute variance
    variance = zeros(T, length(data))
    variance[1] = one(T)
    for t in 1:(length(data) - 1)
        variance[t+1] = (1 - dt * a) * variance[t] + dt * data[t]^2 + dt * b
    end
    variance
end

function model_loss(u::AV{T}, data::AV{<:Real})::T where T<:Real
    variance = model_variance(u, data)
    loglik::T = zero(T)
    for (r, var) in zip(data, variance)
        loglik += -(log(2π) + log(var) + r^2 / var) / 2
    end
    -loglik / length(data)
end

function model_fit(u0::AV{T}, data::AV{<:Real}) where T<:Real
    func = Optimization.OptimizationFunction(
        model_loss, Optimization.AutoForwardDiff(),
        cons=model_constraints!
    )
    prob = Optimization.OptimizationProblem(
        func, u0, data,
        # 0 < dt < 1 && 1 < a < Inf && 0 < b < Inf
        lb=T[0.0, 1.0, 0.0], ub=T[1.0, Inf, Inf],
        #    ^dt  ^a   ^b         ^dt  ^a   ^b  <= model parameters 
        lcons=T[-Inf], ucons=T[0.0] # a - 1/dt < 0
    )
    sol = Optimization.solve(prob, Ipopt.Optimizer())
    sol.u
end

let 
    data = [
        2.1217711584057386, -0.28350145551002465, 2.3593492969513004, 0.192856733601849, 0.4566485836385113, 1.332717934013979, -1.286716619379847, 0.9868669960185211, 2.2358674776395224, -2.7933975791568098,
        1.2555871497124622, 1.276879759908467, -0.8392016987911409, -1.1580875182201849, 0.33201646080578456, -0.17212553408696898, 1.1275285626369556, 0.23041139849229036, 1.648423577528424, 2.384823597473343,
        -0.4005518932539747, -1.117737311211693, -0.9490152960583265, -1.1454539355078672, 1.4158585811404159, -0.18926972177257692, -0.2867541528181491, -1.2077459688543788, -0.6397173049620141, 0.66147783407023,
        0.049805188778543466, 0.902540117368457, -0.7018417933284938, 0.47342354473843684, 1.2620345361591596, -1.1483844812087018, -0.06487285080802752, 0.39020117013487715, -0.38454491504165356, 1.5125786171885645,
        -0.6751768274451174, 0.490916740658628, 0.012872300530924086, 0.46532447715746716, 0.34734421531357157, 0.3830452463549559, -0.8730874028738718, 0.4333151627834603, -0.40396180775692375, 2.0794821773418497,
        -0.5392735774960918, 0.6519326323752113, -1.4844713145398716, 0.3688828625691108, 1.010912990717231, 0.5018274939956874, 0.36656889279915833, -0.11403975693239479, -0.6460314660359935, -0.41997005020823147,
        0.9652752515820495, -0.37375868692702047, -0.5780729659197872, 2.642742798278919, 0.5076984117208074, -0.4906395089461916, -1.804352047187329, -0.8596663844837792, -0.7510485548262176, -0.07922589350581195,
        1.7201304839487317, 0.9024493222130577, -1.8216089665357902, 1.3929269238775426, -0.08410752079538407, 0.6423068180438288, 0.6615201016351212, 0.18546977816594887, -0.717521690742993, -1.0224309324751113,
        1.7748350222721971, 0.1929546575877559, -0.1581871639724676, 0.20198379311238596, -0.6919373947349301, -0.9253274269423383, 0.549366272989534, -1.9302106783541606, 0.7197247279281573, -1.220334158468621,
        -0.9187468058921053, -2.1452607604834184, -2.1558650694862687, -0.9387913392336701, -0.676637835687265, -0.16621998352492198, 0.5637177022958897, -0.5258315560278541, 0.8413359958184765, -0.9096866525337141
    ]
    # u0 = [0 < dt < 1, 1 < a < 1/dt, 0 < b < Inf]
    u0 = [0.3, 2.3333333333333335, 0.33333333333333337]
    @assert 0 < u0[1] < 1
    @assert 1 < u0[2] < 1 / u0[1]
    @assert 0 < u0[3] < Inf
    @info "Optimizing..." u0
    model_fit(u0, data)
end
```

you get:

```julia
DomainError detected in the user `f` function. This occurs when the domain of a function is violated.
For example, `log(-1.0)` is undefined because `log` of a real number is defined to only output real
numbers, but `log` of a negative number is complex valued and therefore Julia throws a DomainError
by default. Cases to be aware of include:

* `log(x)`, `sqrt(x)`, `cbrt(x)`, etc. where `x<0`
* `x^y` for `x<0` floating point `y` (example: `(-1.0)^(1/2) == im`)

Within the context of SciML, this error can occur within the solver process even if the domain constraint
would not be violated in the solution due to adaptivity. For example, an ODE solver or optimization
routine may check a step at `new_u` which violates the domain constraint, and if violated reject the
step and use a smaller `dt`. However, the throwing of this error will have halted the solving process.

Thus the recommended fix is to replace this function with the equivalent ones from NaNMath.jl
(https://github.com/JuliaMath/NaNMath.jl) which returns a NaN instead of an error. The solver will then
effectively use the NaN within the error control routines to reject the out of bounds step. Additionally,
one could perform a domain transformation on the variables so that such an issue does not occur in the
definition of `f`.

For more information, check out the following FAQ page:
https://docs.sciml.ai/Optimization/stable/API/FAQ/#The-Solver-Seems-to-Violate-Constraints-During-the-Optimization,-Causing-DomainErrors,-What-Can-I-Do-About-That?

Note that detailed debugging information adds a small amount of overhead to SciML solves
which can be disabled with the keyword argument `debug = NoDebug()`.

The detailed original error message information from Julia reproduced below:


ERROR: DomainError with -2.4941978436429695:
log will only return a complex result if called with a complex argument. Try log(Complex(x)).
Stacktrace:
 [1] (::SciMLBase.VerboseDebugFunction{typeof(SciMLBase.__solve)})(::SciMLBase.OptimizationProblem{true, SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Nothing, Vector{Float64}, Vector{Float64}, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, ::Vararg{Any})
   @ SciMLBase C:\Users\accou\.julia\dev\SciMLBase\src\debug.jl:66
 [2] #solve#572
   @ c:\Users\accou\.julia\dev\SciMLBase\src\solve.jl:87 [inlined]
 [3] solve
   @ c:\Users\accou\.julia\dev\SciMLBase\src\solve.jl:80 [inlined]
 [4] model_fit(u0::Vector{Float64}, data::Vector{Float64})
   @ Main c:\Users\accou\OneDrive\Computer\Desktop\test.jl:77
 [5] top-level scope
   @ c:\Users\accou\OneDrive\Computer\Desktop\test.jl:88

caused by: DomainError with -2.4941978436429695:
log will only return a complex result if called with a complex argument. Try log(Complex(x)).
Stacktrace:
  [1] throw_complex_domainerror(f::Symbol, x::Float64)
    @ Base.Math .\math.jl:33
  [2] _log(x::Float64, base::Val{:ℯ}, func::Symbol)
    @ Base.Math .\special\log.jl:301
  [3] log
    @ .\special\log.jl:267 [inlined]
  [4] model_loss(u::Vector{Float64}, data::Vector{Float64})
    @ Main c:\Users\accou\OneDrive\Computer\Desktop\test.jl:60
  [5] OptimizationFunction
    @ C:\Users\accou\.julia\dev\SciMLBase\src\scimlfunctions.jl:3580 [inlined]
  [6] eval_objective(moiproblem::OptimizationMOI.MOIOptimizationProblem{Float64, SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Optimization.var"#57#74"{ForwardDiff.GradientConfig{ForwardDiff.Tag{Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Float64}, Float64, 3, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Float64}, Float64, 3}}}, Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}}, Optimization.var"#60#77"{ForwardDiff.HessianConfig{ForwardDiff.Tag{Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Float64}, Float64, 3, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Float64}, ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Float64}, Float64, 3}, 3}}, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Float64}, Float64, 3}}}, Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, 
typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}}, Optimization.var"#63#80", Optimization.var"#64#81"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Optimization.var"#66#83"{ForwardDiff.JacobianConfig{ForwardDiff.Tag{Optimization.var"#65#82"{Int64}, Float64}, Float64, 3, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#65#82"{Int64}, Float64}, Float64, 3}}}}, Optimization.var"#71#88"{Int64, Vector{ForwardDiff.HessianConfig{ForwardDiff.Tag{Optimization.var"#69#86"{Int64}, Float64}, Float64, 3, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#69#86"{Int64}, Float64}, ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#69#86"{Int64}, Float64}, Float64, 3}, 3}}, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#69#86"{Int64}, Float64}, Float64, 3}}}}, Vector{Optimization.var"#69#86"{Int64}}}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, Vector{Float64}, Matrix{Float64}, Matrix{Float64}, Matrix{Float64}}, x::Vector{Float64})
    @ OptimizationMOI C:\Users\accou\.julia\packages\OptimizationMOI\cHl7S\src\OptimizationMOI.jl:82
  [7] eval_objective(model::Ipopt.Optimizer, x::Vector{Float64})
    @ Ipopt C:\Users\accou\.julia\packages\Ipopt\rQctM\src\MOI_wrapper.jl:514
  [8] (::Ipopt.var"#eval_f_cb#1"{Ipopt.Optimizer})(x::Vector{Float64})
    @ Ipopt C:\Users\accou\.julia\packages\Ipopt\rQctM\src\MOI_wrapper.jl:597
  [9] _Eval_F_CB(n::Int32, x_ptr::Ptr{Float64}, x_new::Int32, obj_value::Ptr{Float64}, user_data::Ptr{Nothing})
    @ Ipopt C:\Users\accou\.julia\packages\Ipopt\rQctM\src\C_wrapper.jl:38
 [10] IpoptSolve(prob::Ipopt.IpoptProblem)
    @ Ipopt C:\Users\accou\.julia\packages\Ipopt\rQctM\src\C_wrapper.jl:442
 [11] optimize!(model::Ipopt.Optimizer)
    @ Ipopt C:\Users\accou\.julia\packages\Ipopt\rQctM\src\MOI_wrapper.jl:727
 [12] __solve(prob::SciMLBase.OptimizationProblem{true, SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Nothing, Vector{Float64}, Vector{Float64}, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, opt::Ipopt.Optimizer; maxiters::Nothing, maxtime::Nothing, abstol::Nothing, reltol::Nothing, kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
    @ OptimizationMOI C:\Users\accou\.julia\packages\OptimizationMOI\cHl7S\src\OptimizationMOI.jl:381
 [13] __solve(prob::SciMLBase.OptimizationProblem{true, SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Nothing, Vector{Float64}, Vector{Float64}, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, opt::Ipopt.Optimizer)
    @ OptimizationMOI C:\Users\accou\.julia\packages\OptimizationMOI\cHl7S\src\OptimizationMOI.jl:327
 [14] (::SciMLBase.VerboseDebugFunction{typeof(SciMLBase.__solve)})(::SciMLBase.OptimizationProblem{true, SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Nothing, Vector{Float64}, Vector{Float64}, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, ::Vararg{Any})
    @ SciMLBase C:\Users\accou\.julia\dev\SciMLBase\src\debug.jl:59
 [15] #solve#572
    @ c:\Users\accou\.julia\dev\SciMLBase\src\solve.jl:87 [inlined]
 [16] solve
    @ c:\Users\accou\.julia\dev\SciMLBase\src\solve.jl:80 [inlined]
 [17] model_fit(u0::Vector{Float64}, data::Vector{Float64})
    @ Main c:\Users\accou\OneDrive\Computer\Desktop\test.jl:77
 [18] top-level scope
    @ c:\Users\accou\OneDrive\Computer\Desktop\test.jl:88
```
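
Following the fix recommended in the message above, a minimal change to the example's model_loss swaps the plain log for NaNMath.log so that an out-of-domain trial point yields NaN instead of throwing; this is a sketch assuming model_variance and the AV alias are defined as in the script above.

```julia
import NaNMath

function model_loss(u::AV{T}, data::AV{<:Real})::T where T<:Real
    variance = model_variance(u, data)
    loglik::T = zero(T)
    for (r, var) in zip(data, variance)
        # NaNMath.log returns NaN for var <= 0 instead of throwing a DomainError,
        # so the optimizer can reject the step through its own error control.
        loglik += -(log(2π) + NaNMath.log(var) + r^2 / var) / 2
    end
    -loglik / length(data)
end
```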