
Commit 31276c9

Add a VerboseDebug mode to catch errors and give more high level context
This should be a nice improvement overall to the health of the debugging experience. For example, the code from this post (https://discourse.julialang.org/t/optimizationmoi-ipopt-violating-inequality-constraint/92608) led to a question that took a bit of effort to understand. But now when you run

```julia
import Optimization
import OptimizationMOI, Ipopt

const AV{T} = AbstractVector{T}

function model_constraints!(out::AV{<:Real}, u::AV{<:Real}, data)
    # Model parameters
    dt, a, b = u
    out[1] = a - 1/dt # Must be < 0
    @info "Must be NEGATIVE: $(out[1])"
end

function model_variance(u::AV{T}, data::AV{<:Real}) where T<:Real
    # Model parameters
    dt, a, b = u
    # Compute variance
    variance = zeros(T, length(data))
    variance[1] = one(T)
    for t in 1:(length(data) - 1)
        variance[t+1] = (1 - dt * a) * variance[t] + dt * data[t]^2 + dt * b
    end
    variance
end

function model_loss(u::AV{T}, data::AV{<:Real})::T where T<:Real
    variance = model_variance(u, data)
    loglik::T = zero(T)
    for (r, var) in zip(data, variance)
        loglik += -(log(2π) + log(var) + r^2 / var) / 2
    end
    -loglik / length(data)
end

function model_fit(u0::AV{T}, data::AV{<:Real}) where T<:Real
    func = Optimization.OptimizationFunction(
        model_loss, Optimization.AutoForwardDiff(),
        cons=model_constraints!
    )
    prob = Optimization.OptimizationProblem(
        func, u0, data,
        # 0 < dt < 1 && 1 < a < Inf && 0 < b < Inf
        lb=T[0.0, 1.0, 0.0], ub=T[1.0, Inf, Inf],
        #    ^dt  ^a   ^b        ^dt  ^a   ^b   <= model parameters
        lcons=T[-Inf], ucons=T[0.0] # a - 1/dt < 0
    )
    sol = Optimization.solve(prob, Ipopt.Optimizer())
    sol.u
end

let data = [
        2.1217711584057386, -0.28350145551002465, 2.3593492969513004, 0.192856733601849, 0.4566485836385113,
        1.332717934013979, -1.286716619379847, 0.9868669960185211, 2.2358674776395224, -2.7933975791568098,
        1.2555871497124622, 1.276879759908467, -0.8392016987911409, -1.1580875182201849, 0.33201646080578456,
        -0.17212553408696898, 1.1275285626369556, 0.23041139849229036, 1.648423577528424, 2.384823597473343,
        -0.4005518932539747, -1.117737311211693, -0.9490152960583265, -1.1454539355078672, 1.4158585811404159,
        -0.18926972177257692, -0.2867541528181491, -1.2077459688543788, -0.6397173049620141, 0.66147783407023,
        0.049805188778543466, 0.902540117368457, -0.7018417933284938, 0.47342354473843684, 1.2620345361591596,
        -1.1483844812087018, -0.06487285080802752, 0.39020117013487715, -0.38454491504165356, 1.5125786171885645,
        -0.6751768274451174, 0.490916740658628, 0.012872300530924086, 0.46532447715746716, 0.34734421531357157,
        0.3830452463549559, -0.8730874028738718, 0.4333151627834603, -0.40396180775692375, 2.0794821773418497,
        -0.5392735774960918, 0.6519326323752113, -1.4844713145398716, 0.3688828625691108, 1.010912990717231,
        0.5018274939956874, 0.36656889279915833, -0.11403975693239479, -0.6460314660359935, -0.41997005020823147,
        0.9652752515820495, -0.37375868692702047, -0.5780729659197872, 2.642742798278919, 0.5076984117208074,
        -0.4906395089461916, -1.804352047187329, -0.8596663844837792, -0.7510485548262176, -0.07922589350581195,
        1.7201304839487317, 0.9024493222130577, -1.8216089665357902, 1.3929269238775426, -0.08410752079538407,
        0.6423068180438288, 0.6615201016351212, 0.18546977816594887, -0.717521690742993, -1.0224309324751113,
        1.7748350222721971, 0.1929546575877559, -0.1581871639724676, 0.20198379311238596, -0.6919373947349301,
        -0.9253274269423383, 0.549366272989534, -1.9302106783541606, 0.7197247279281573, -1.220334158468621,
        -0.9187468058921053, -2.1452607604834184, -2.1558650694862687, -0.9387913392336701, -0.676637835687265,
        -0.16621998352492198, 0.5637177022958897, -0.5258315560278541, 0.8413359958184765, -0.9096866525337141
    ]

    # u0 = [0 < dt < 1, 1 < a < 1/dt, 0 < b < Inf]
    u0 = [0.3, 2.3333333333333335, 0.33333333333333337]

    @assert 0 < u0[1] < 1
    @assert 1 < u0[2] < 1 / u0[1]
    @assert 0 < u0[3] < Inf

    @info "Optimizing..." u0
    model_fit(u0, data)
end
```

you get:

```julia
DomainError detected in the user `f` function. This occurs when the domain of a function is violated.
For example, `log(-1.0)` is undefined because `log` of a real number is defined to only output real
numbers, but `log` of a negative number is complex valued and therefore Julia throws a DomainError
by default. Cases to be aware of include:

* `log(x)`, `sqrt(x)`, `cbrt(x)`, etc. where `x<0`
* `x^y` for `x<0` floating point `y` (example: `(-1.0)^(1/2) == im`)

Within the context of SciML, this error can occur within the solver process even if the domain constraint
would not be violated in the solution due to adaptivity. For example, an ODE solver or optimization
routine may check a step at `new_u` which violates the domain constraint, and if violated reject the
step and use a smaller `dt`. However, the throwing of this error will have halted the solving process.

Thus the recommended fix is to replace this function with the equivalent ones from NaNMath.jl
(https://github.com/JuliaMath/NaNMath.jl) which returns a NaN instead of an error. The solver will then
effectively use the NaN within the error control routines to reject the out of bounds step. Additionally,
one could perform a domain transformation on the variables so that such an issue does not occur in the
definition of `f`.

For more information, check out the following FAQ page:
https://docs.sciml.ai/Optimization/stable/API/FAQ/#The-Solver-Seems-to-Violate-Constraints-During-the-Optimization,-Causing-DomainErrors,-What-Can-I-Do-About-That?

Note that detailed debugging information adds a small amount of overhead to SciML solves which can be
disabled with the keyword argument `debug = NoDebug()`.

The detailed original error message information from Julia reproduced below:

ERROR: DomainError with -2.4941978436429695:
log will only return a complex result if called with a complex argument. Try log(Complex(x)).
Stacktrace:
  [1] (::SciMLBase.VerboseDebugFunction{typeof(SciMLBase.__solve)})(::SciMLBase.OptimizationProblem{true, SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Nothing, Vector{Float64}, Vector{Float64}, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, ::Vararg{Any})
    @ SciMLBase C:\Users\accou\.julia\dev\SciMLBase\src\debug.jl:66
  [2] #solve#572
    @ c:\Users\accou\.julia\dev\SciMLBase\src\solve.jl:87 [inlined]
  [3] solve
    @ c:\Users\accou\.julia\dev\SciMLBase\src\solve.jl:80 [inlined]
  [4] model_fit(u0::Vector{Float64}, data::Vector{Float64})
    @ Main c:\Users\accou\OneDrive\Computer\Desktop\test.jl:77
  [5] top-level scope
    @ c:\Users\accou\OneDrive\Computer\Desktop\test.jl:88

caused by: DomainError with -2.4941978436429695:
log will only return a complex result if called with a complex argument. Try log(Complex(x)).
Stacktrace:
  [1] throw_complex_domainerror(f::Symbol, x::Float64)
    @ Base.Math .\math.jl:33
  [2] _log(x::Float64, base::Val{:ℯ}, func::Symbol)
    @ Base.Math .\special\log.jl:301
  [3] log
    @ .\special\log.jl:267 [inlined]
  [4] model_loss(u::Vector{Float64}, data::Vector{Float64})
    @ Main c:\Users\accou\OneDrive\Computer\Desktop\test.jl:60
  [5] OptimizationFunction
    @ C:\Users\accou\.julia\dev\SciMLBase\src\scimlfunctions.jl:3580 [inlined]
  [6] eval_objective(moiproblem::OptimizationMOI.MOIOptimizationProblem{Float64, SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Optimization.var"#57#74"{ForwardDiff.GradientConfig{ForwardDiff.Tag{Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Float64}, Float64, 3, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Float64}, Float64, 3}}}, Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}}, Optimization.var"#60#77"{ForwardDiff.HessianConfig{ForwardDiff.Tag{Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Float64}, Float64, 3, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Float64}, ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Float64}, Float64, 3}, 3}}, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Float64}, Float64, 3}}}, Optimization.var"#56#73"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}}, Optimization.var"#63#80", Optimization.var"#64#81"{SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}}, Optimization.var"#66#83"{ForwardDiff.JacobianConfig{ForwardDiff.Tag{Optimization.var"#65#82"{Int64}, Float64}, Float64, 3, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#65#82"{Int64}, Float64}, Float64, 3}}}}, Optimization.var"#71#88"{Int64, Vector{ForwardDiff.HessianConfig{ForwardDiff.Tag{Optimization.var"#69#86"{Int64}, Float64}, Float64, 3, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#69#86"{Int64}, Float64}, ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#69#86"{Int64}, Float64}, Float64, 3}, 3}}, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#69#86"{Int64}, Float64}, Float64, 3}}}}, Vector{Optimization.var"#69#86"{Int64}}}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, Vector{Float64}, Matrix{Float64}, Matrix{Float64}, Matrix{Float64}}, x::Vector{Float64})
    @ OptimizationMOI C:\Users\accou\.julia\packages\OptimizationMOI\cHl7S\src\OptimizationMOI.jl:82
  [7] eval_objective(model::Ipopt.Optimizer, x::Vector{Float64})
    @ Ipopt C:\Users\accou\.julia\packages\Ipopt\rQctM\src\MOI_wrapper.jl:514
  [8] (::Ipopt.var"#eval_f_cb#1"{Ipopt.Optimizer})(x::Vector{Float64})
    @ Ipopt C:\Users\accou\.julia\packages\Ipopt\rQctM\src\MOI_wrapper.jl:597
  [9] _Eval_F_CB(n::Int32, x_ptr::Ptr{Float64}, x_new::Int32, obj_value::Ptr{Float64}, user_data::Ptr{Nothing})
    @ Ipopt C:\Users\accou\.julia\packages\Ipopt\rQctM\src\C_wrapper.jl:38
 [10] IpoptSolve(prob::Ipopt.IpoptProblem)
    @ Ipopt C:\Users\accou\.julia\packages\Ipopt\rQctM\src\C_wrapper.jl:442
 [11] optimize!(model::Ipopt.Optimizer)
    @ Ipopt C:\Users\accou\.julia\packages\Ipopt\rQctM\src\MOI_wrapper.jl:727
 [12] __solve(prob::SciMLBase.OptimizationProblem{true, SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Nothing, Vector{Float64}, Vector{Float64}, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, opt::Ipopt.Optimizer; maxiters::Nothing, maxtime::Nothing, abstol::Nothing, reltol::Nothing, kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
    @ OptimizationMOI C:\Users\accou\.julia\packages\OptimizationMOI\cHl7S\src\OptimizationMOI.jl:381
 [13] __solve(prob::SciMLBase.OptimizationProblem{true, SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Nothing, Vector{Float64}, Vector{Float64}, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, opt::Ipopt.Optimizer)
    @ OptimizationMOI C:\Users\accou\.julia\packages\OptimizationMOI\cHl7S\src\OptimizationMOI.jl:327
 [14] (::SciMLBase.VerboseDebugFunction{typeof(SciMLBase.__solve)})(::SciMLBase.OptimizationProblem{true, SciMLBase.OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, typeof(model_loss), Nothing, Nothing, Nothing, typeof(model_constraints!), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Nothing, Vector{Float64}, Vector{Float64}, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, ::Vararg{Any})
    @ SciMLBase C:\Users\accou\.julia\dev\SciMLBase\src\debug.jl:59
 [15] #solve#572
    @ c:\Users\accou\.julia\dev\SciMLBase\src\solve.jl:87 [inlined]
 [16] solve
    @ c:\Users\accou\.julia\dev\SciMLBase\src\solve.jl:80 [inlined]
 [17] model_fit(u0::Vector{Float64}, data::Vector{Float64})
    @ Main c:\Users\accou\OneDrive\Computer\Desktop\test.jl:77
 [18] top-level scope
    @ c:\Users\accou\OneDrive\Computer\Desktop\test.jl:88
```
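For reference, the fix that the new message points to looks roughly like the following. This is a sketch only, not part of this commit: it assumes the `AV`, `model_variance`, and `model_loss` definitions from the reproducer above and adds NaNMath.jl as a dependency, so an out-of-domain variance yields a `NaN` the optimizer can reject rather than a thrown `DomainError`.

```julia
# Sketch of the recommended fix (not part of this commit); assumes the definitions above.
import NaNMath

function model_loss(u::AV{T}, data::AV{<:Real})::T where T<:Real
    variance = model_variance(u, data)
    loglik::T = zero(T)
    for (r, var) in zip(data, variance)
        # NaNMath.log returns NaN for negative `var` instead of throwing a DomainError,
        # so a bad trial step is rejected by the solver rather than crashing the solve.
        loglik += -(log(2π) + NaNMath.log(var) + r^2 / var) / 2
    end
    -loglik / length(data)
end
```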
1 parent 0cf6597 commit 31276c9

File tree: 3 files changed (78 additions & 3 deletions)


src/SciMLBase.jl

Lines changed: 1 addition & 0 deletions
```diff
@@ -588,6 +588,7 @@ include("utils.jl")
 include("function_wrappers.jl")
 include("scimlfunctions.jl")
 include("alg_traits.jl")
+include("debug.jl")
 
 unwrapped_f(f) = f
 unwrapped_f(f::Void) = unwrapped_f(f.f)
```

src/debug.jl

Lines changed: 71 additions & 0 deletions
```diff
@@ -0,0 +1,71 @@
+abstract type DebugMode end
+struct NoDebug <: DebugMode end
+struct VerboseDebug <: DebugMode end
+
+DETAILED_INFORMATION = """
+Note that detailed debugging information adds a small amount of overhead to SciML solves
+which can be disabled with the keyword argument `debug = NoDebug()`.
+
+The detailed original error message information from Julia reproduced below:
+"""
+
+DEFAULT_DEBUG_MSG = """
+Error detected inside of the run of the solver. For more detailed information about the error
+with SciML context and recommendations, try adding the keyword argument `debug = VerboseDebug()`.
+
+$DETAILED_INFORMATION
+"""
+
+UNKNOWN_MSG = """
+An unclassified error occurred during the solver process and SciML's `VerboseDebug` mode cannot give
+any more information. You can help improve the debug mode messages by reporting this error to
+https://github.com/SciML/SciMLBase.jl/issues with a reproducer of the error from which a high
+level description can be added.
+
+$DETAILED_INFORMATION
+"""
+
+DOMAINERROR_COMPLEX_MSG = """
+DomainError detected in the user `f` function. This occurs when the domain of a function is violated.
+For example, `log(-1.0)` is undefined because `log` of a real number is defined to only output real
+numbers, but `log` of a negative number is complex valued and therefore Julia throws a DomainError
+by default. Cases to be aware of include:
+
+* `log(x)`, `sqrt(x)`, `cbrt(x)`, etc. where `x<0`
+* `x^y` for `x<0` floating point `y` (example: `(-1.0)^(1/2) == im`)
+
+Within the context of SciML, this error can occur within the solver process even if the domain constraint
+would not be violated in the solution due to adaptivity. For example, an ODE solver or optimization
+routine may check a step at `new_u` which violates the domain constraint, and if violated reject the
+step and use a smaller `dt`. However, the throwing of this error will have halted the solving process.
+
+Thus the recommended fix is to replace this function with the equivalent ones from NaNMath.jl
+(https://github.com/JuliaMath/NaNMath.jl) which returns a NaN instead of an error. The solver will then
+effectively use the NaN within the error control routines to reject the out of bounds step. Additionally,
+one could perform a domain transformation on the variables so that such an issue does not occur in the
+definition of `f`.
+
+For more information, check out the following FAQ page:
+https://docs.sciml.ai/Optimization/stable/API/FAQ/#The-Solver-Seems-to-Violate-Constraints-During-the-Optimization,-Causing-DomainErrors,-What-Can-I-Do-About-That?
+
+$DETAILED_INFORMATION
+"""
+
+struct VerboseDebugFunction{F}
+    f::F
+end
+function (f::VerboseDebugFunction)(args...)
+    try
+        f.f(args...)
+    catch e
+        if e isa DomainError && occursin("will only return a complex result if called with a complex argument. Try ",e.msg)
+            println(DOMAINERROR_COMPLEX_MSG)
+        else
+            println(UNKNOWN_MSG)
+        end
+        throw(e)
+    end
+end
+
+debugwrapfun(f,debug::VerboseDebug) = VerboseDebugFunction(f)
+debugwrapfun(f,debug::NoDebug) = f
```
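As a quick illustration of the control flow above, here is a hypothetical REPL snippet. It assumes a SciMLBase build that contains this commit and uses fully qualified names, since none of these symbols are exported by the diff shown here.

```julia
# Hypothetical illustration of src/debug.jl above (assumes SciMLBase with this commit).
using SciMLBase

f(x) = log(x)  # log throws a DomainError for x < 0

g = SciMLBase.debugwrapfun(f, SciMLBase.VerboseDebug())
try
    g(-1.0)  # prints DOMAINERROR_COMPLEX_MSG, then throws the DomainError again
catch err
    @show typeof(err)  # DomainError
end

SciMLBase.debugwrapfun(f, SciMLBase.NoDebug()) === f  # NoDebug leaves `f` untouched
```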

src/solve.jl

Lines changed: 6 additions & 3 deletions
````diff
@@ -23,6 +23,7 @@ These common arguments are:
 - `abstol` (absolute tolerance in changes of the objective value)
 - `reltol` (relative tolerance in changes of the objective value)
 - `callback` (a callback function)
+- `debug = VerboseDebug()` (sets the debug mode for improved error messages)
 
 If the chosen global optimizer employs a local optimization method,
 a similar set of common local optimizer arguments exists.
@@ -77,12 +78,13 @@ end
 ```
 """
 function solve(prob::OptimizationProblem, alg, args...;
-    kwargs...)::AbstractOptimizationSolution
+    debug = VerboseDebug(), kwargs...)::AbstractOptimizationSolution
+
     if supports_opt_cache_interface(alg)
-        solve!(init(prob, alg, args...; kwargs...))
+        debugwrapfun(solve!,debug)(debugwrapfun(init,debug)(prob, alg, args...; kwargs...))
     else
         _check_opt_alg(prob, alg; kwargs...)
-        __solve(prob, alg, args...; kwargs...)
+        debugwrapfun(__solve,debug)(prob, alg, args...; kwargs...)
     end
 end
 
@@ -135,6 +137,7 @@ These common arguments are:
 - `abstol` (absolute tolerance in changes of the objective value)
 - `reltol` (relative tolerance in changes of the objective value)
 - `callback` (a callback function)
+- `debug = VerboseDebug()` (sets the debug mode for improved error messages)
 
 Some optimizer algorithms have special keyword arguments documented in the
 solver portion of the documentation and their respective documentation.
````
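Since `debug = VerboseDebug()` is now the default for `solve` on an `OptimizationProblem`, opting out is a one-keyword change. A hedged sketch (it assumes `prob` and an optimizer such as `Ipopt.Optimizer()` already exist, as in the reproducer above):

```julia
# Sketch: disable the verbose error wrapping (and its small overhead) for one solve call.
# Assumes `prob::OptimizationProblem` and Ipopt are already set up as in the commit message.
sol = Optimization.solve(prob, Ipopt.Optimizer(); debug = SciMLBase.NoDebug())
```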
