
Add literal_getproperty dispatch for VectorOfArray #478


Merged 10 commits into SciML:master on Aug 14, 2025

Conversation

DhairyaLGandhi
Member

@DhairyaLGandhi DhairyaLGandhi commented Aug 12, 2025

Checklist

  • Appropriate tests were added
  • Any code changes were done in a way that does not break public API
  • All documentation related to code changes were updated
  • The new code follows the
    contributor guidelines, in particular the SciML Style Guide and
    COLPRAC.
  • Any new documentation only uses public API

Additional context

This has surfaced in a few places as:

ERROR: MethodError: no method matching size(::@NamedTuple{u::Vector{Vector{Float64}}})
The function `size` exists, but no method is defined for this combination of argument types.
You may need to implement the `length` method or define `IteratorSize` for this type to be `SizeUnknown`.

Closest candidates are:
  size(::IdentityOperator)
   @ SciMLOperators ~/.julia/packages/SciMLOperators/Gd0Qg/src/basic.jl:21
  size(::Graphs.DefaultDistance)
   @ Graphs ~/.julia/packages/Graphs/awp48/src/distance.jl:22
  size(::CSV.ReversedBuf)
   @ CSV ~/.julia/packages/CSV/XLcqT/src/utils.jl:569
  ...

Stacktrace:
  [1] (::RecursiveArrayToolsZygoteExt.var"#48#53")(y::@NamedTuple{u::Vector{Vector{Float64}}})
    @ RecursiveArrayToolsZygoteExt ~/.julia/packages/RecursiveArrayTools/EfvwE/ext/RecursiveArrayToolsZygoteExt.jl:79
  [2] (::RecursiveArrayToolsZygoteExt.var"#90#back#56"{RecursiveArrayToolsZygoteExt.var"#48#53"})(Δ::Base.RefValue{Any})
    @ RecursiveArrayToolsZygoteExt ~/.julia/packages/ZygoteRules/CkVIK/src/adjoint.jl:72
  [3] vecofvec
    @ ~/.julia/dev/DyadModelOptimizer/src/experiments/experiment_api.jl:163 [inlined]
  [4] replace_sol
    @ ~/.julia/dev/DyadModelOptimizer/src/experiments/experiment_api.jl:181 [inlined]
  [5] (::Zygote.Pullback{Tuple{…}, Tuple{…}})(Δ::Tuple{Base.RefValue{…}, Matrix{…}})
    @ Zygote ~/.julia/packages/Zygote/55SqB/src/compiler/interface2.jl:0
  [6] replace_sol
    @ ~/.julia/dev/DyadModelOptimizer/src/experiments/experiment_api.jl:159 [inlined]
  [7] (::Zygote.Pullback{Tuple{…}, Tuple{…}})(Δ::Tuple{Base.RefValue{…}, Matrix{…}})
    @ Zygote ~/.julia/packages/Zygote/55SqB/src/compiler/interface2.jl:0
  [8] compute_error
    @ ~/.julia/dev/DyadModelOptimizer/src/experiments/experiment_api.jl:200 [inlined]
  [9] (::Zygote.Pullback{Tuple{…}, Tuple{…}})(Δ::Float64)
    @ Zygote ~/.julia/packages/Zygote/55SqB/src/compiler/interface2.jl:0
 [10] #cost_contribution#109
    @ ~/.julia/dev/DyadModelOptimizer/src/calibrate/multiple_shooting.jl:23 [inlined]
 [11] (::Zygote.Pullback{Tuple{…}, Any})(Δ::Float64)
    @ Zygote ~/.julia/packages/Zygote/55SqB/src/compiler/interface2.jl:0
 [12] cost_contribution
    @ ~/.julia/dev/DyadModelOptimizer/src/calibrate/multiple_shooting.jl:14 [inlined]
 [13] (::Zygote.Pullback{Tuple{…}, Tuple{…}})(Δ::Float64)
    @ Zygote ~/.julia/packages/Zygote/55SqB/src/compiler/interface2.jl:0
 [14] cost
    @ ~/.julia/dev/DyadModelOptimizer/src/objectives.jl:72 [inlined]
 [15] (::Zygote.Pullback{Tuple{DyadModelOptimizer.var"#cost#99"{…}, Vector{…}, Tuple{…}}, Any})(Δ::Float64)
    @ Zygote ~/.julia/packages/Zygote/55SqB/src/compiler/interface2.jl:0
 [16] (::Zygote.var"#88#89"{Zygote.Pullback{Tuple{}, Any}})(Δ::Float64)
    @ Zygote ~/.julia/packages/Zygote/55SqB/src/compiler/interface.jl:97
 [17] gradient(::Function, ::Vector{Float64}, ::Vararg{Any})
    @ Zygote ~/.julia/packages/Zygote/55SqB/src/compiler/interface.jl:154
 [18] gradient
    @ ~/.julia/packages/DifferentiationInterface/D3LUI/ext/DifferentiationInterfaceZygoteExt/DifferentiationInterfaceZygoteExt.jl:123 [inlined]
 [19] gradient!(f::Function, grad::Vector{…}, prep::DifferentiationInterface.NoGradientPrep{…}, backend::AutoZygote, x::Vector{…}, contexts::DifferentiationInterface.Constant{…})
    @ DifferentiationInterfaceZygoteExt ~/.julia/packages/DifferentiationInterface/D3LUI/ext/DifferentiationInterfaceZygoteExt/DifferentiationInterfaceZygoteExt.jl:139
 [20] (::OptimizationZygoteExt.var"#grad#14"{})(res::Vector{…}, θ::Vector{…})
    @ OptimizationZygoteExt ~/.julia/packages/OptimizationBase/Lc8sB/ext/OptimizationZygoteExt.jl:35
 [21] eval_objective_gradient(evaluator::OptimizationMOI.MOIOptimizationNLPEvaluator{…}, G::Vector{…}, x::Vector{…})
    @ OptimizationMOI ~/.julia/packages/OptimizationMOI/aKPgG/src/nlp.jl:259
 [22] eval_objective_gradient(model::IpoptMathOptInterfaceExt.Optimizer, grad::Vector{Float64}, x::Vector{Float64})
    @ IpoptMathOptInterfaceExt ~/.julia/packages/Ipopt/WfZ1b/ext/IpoptMathOptInterfaceExt/MOI_wrapper.jl:1163
 [23] (::IpoptMathOptInterfaceExt.var"#eval_grad_f_cb#8"{})(x::Vector{…}, grad_f::Vector{…})
    @ IpoptMathOptInterfaceExt ~/.julia/packages/Ipopt/WfZ1b/ext/IpoptMathOptInterfaceExt/MOI_wrapper.jl:1372
 [24] _Eval_Grad_F_CB(n::Int32, x_ptr::Ptr{Float64}, ::Int32, grad_f::Ptr{Float64}, user_data::Ptr{Nothing})
    @ Ipopt ~/.julia/packages/Ipopt/WfZ1b/src/C_wrapper.jl:56
 [25] #5
    @ ~/.julia/packages/Ipopt/WfZ1b/src/C_wrapper.jl:407 [inlined]
 [26] disable_sigint(f::Ipopt.var"#5#6"{Ipopt.IpoptProblem, Base.RefValue{Float64}})
    @ Base ./c.jl:167
 [27] IpoptSolve
    @ ~/.julia/packages/Ipopt/WfZ1b/src/C_wrapper.jl:406 [inlined]
 [28] optimize!(model::IpoptMathOptInterfaceExt.Optimizer)
    @ IpoptMathOptInterfaceExt ~/.julia/packages/Ipopt/WfZ1b/ext/IpoptMathOptInterfaceExt/MOI_wrapper.jl:1526
 [29] optimize!(b::MathOptInterface.Bridges.LazyBridgeOptimizer{IpoptMathOptInterfaceExt.Optimizer})
    @ MathOptInterface.Bridges ~/.julia/packages/MathOptInterface/vK6dk/src/Bridges/bridge_optimizer.jl:367
 [30] __solve(cache::OptimizationMOI.MOIOptimizationNLPCache{…})
    @ OptimizationMOI ~/.julia/packages/OptimizationMOI/aKPgG/src/nlp.jl:550
 [31] solve!(cache::OptimizationMOI.MOIOptimizationNLPCache{…})
    @ SciMLBase ~/.julia/packages/SciMLBase/URv1y/src/solve.jl:234
 [32] solve(::OptimizationProblem{…}, ::MathOptInterface.OptimizerWithAttributes; kwargs::@Kwargs{})
    @ SciMLBase ~/.julia/packages/SciMLBase/URv1y/src/solve.jl:131
 [33] macro expansion
    @ ./timing.jl:581 [inlined]
 [34] calibrate(prob::InverseProblem{…}, alg::SingleShooting{…}; adtype::AutoZygote, x0::Vector{…}, bounds::Tuple{…}, progress::Bool, parentid::Base.UUID, optimizer::MathOptInterface.OptimizerWithAttributes)
    @ DyadModelOptimizer ~/.julia/dev/DyadModelOptimizer/src/calibrate/calibrate.jl:38
 [35] top-level scope
    @ REPL[1]:1
Some type information was truncated. Use `show(err)` to see complete types.

cc @SebastianM-C

I am not a fan of adding a `literal_getproperty` dispatch, but this can serve as a stopgap until we remove all similar ones at once.
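For reference, such a dispatch has roughly the following shape. This is a hypothetical sketch, not the code merged in this PR, and the exact adjoint signature in Zygote's internals may differ:

```julia
# Hypothetical sketch (not the merged implementation): an adjoint for
# literal field access so that differentiating through `A.u` on a
# VectorOfArray rewraps the incoming cotangent instead of leaving it as a
# bare NamedTuple (which has no `size` method, hence the MethodError above).
using RecursiveArrayTools
using Zygote
import ZygoteRules: @adjoint

@adjoint function Zygote.literal_getproperty(A::VectorOfArray, ::Val{:u})
    function literal_getproperty_pullback(ȳ)
        # Rewrap the cotangent for the `u` field as a VectorOfArray so
        # downstream rules that call `size` etc. keep working.
        return (VectorOfArray(ȳ), nothing)
    end
    return A.u, literal_getproperty_pullback
end
```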


@SebastianM-C
Contributor

For a reproducible stack trace:

using ModelingToolkit

function reactionsystem()
    t = ModelingToolkit.t_nounits
    D = ModelingToolkit.D_nounits
    sts = @variables s1(t)=2.0 s1s2(t)=2.0 s2(t)=2.0
    ps = @parameters k1=1.0 c1=2.0 [bounds = (0, 2), tunable = true]
    eqs = [D(s1) ~ -0.25 * c1 * k1 * s1 * s2
           D(s1s2) ~ 0.25 * c1 * k1 * s1 * s2
           D(s2) ~ -0.25 * c1 * k1 * s1 * s2]

    return mtkcompile(System(eqs, t; name = :reactionsystem))
end

using SciMLSensitivity
using Zygote
using OrdinaryDiffEqTsit5
using SymbolicIndexingInterface
using DifferentiationInterface
using RecursiveArrayTools

sys = reactionsystem()

prob = ODEProblem(sys, [], (0, 1))
ts = range(0, 1, length = 10)
sol = solve(prob, Tsit5(), saveat = ts)
data = Matrix(sol)

get_vars = getu(sys, [sys.s1, sys.s1s2, sys.s2])
set_x = setsym_oop(sys, [sys.c1, sys.k1])

function squaredl2loss(sol::AbstractVectorOfArray, data)
    T = eltype(data)
    𝟘 = zero(promote_type(eltype(sol), T))
    err = 𝟘
    @assert size(sol, 1) == size(data, 1)
    @inbounds for (s, d) in zip(sol.u, eachcol(data))
        for i in eachindex(s, d)
            if !ismissing(d[i])
                err += (s[i] - d[i])^2
            else
                err += 𝟘
            end
        end
    end
    return err
end
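As a quick sanity check of the loss itself (independent of AD), with the `squaredl2loss` definition above in scope it can be exercised on a small hand-built solution; the numbers below are illustrative:

```julia
using RecursiveArrayTools

# Two "time points", each a state vector of length 2.
sol = VectorOfArray([[1.0, 2.0], [3.0, 4.0]])
# Data matrix with one column per time point; only the last entry differs.
data = [1.0 3.0;
        2.0 5.0]

squaredl2loss(sol, data)  # (4.0 - 5.0)^2 = 1.0
```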

function loss(x, (prob, get_vars, data, ts, set_x))
    new_u0, new_p = set_x(prob, x)
    new_prob = remake(prob, p = new_p, u0 = new_u0)
    new_sol = solve(new_prob, Tsit5(), saveat = ts)

    if SciMLBase.successful_retcode(new_sol)
        u = VectorOfArray(get_vars(new_sol))
        squaredl2loss(u, data)
    else
        Inf
    end
end

ps = (prob, get_vars, data, ts, set_x);

DifferentiationInterface.gradient(x -> loss(x, ps), AutoZygote(), [1.5, 2.0])

gives

ERROR: MethodError: no method matching size(::@NamedTuple{u::Vector{Vector{Float64}}})
The function `size` exists, but no method is defined for this combination of argument types.
You may need to implement the `length` method or define `IteratorSize` for this type to be `SizeUnknown`.

Closest candidates are:
  size(::IdentityOperator)
   @ SciMLOperators ~/.julia/packages/SciMLOperators/Gd0Qg/src/basic.jl:21
  size(::Graphs.DefaultDistance)
   @ Graphs ~/.julia/packages/Graphs/awp48/src/distance.jl:22
  size(::LLVM.FunctionParameterSet)
   @ LLVM ~/.julia/packages/LLVM/UFrs4/src/core/function.jl:200
  ...

Stacktrace:
  [1] (::RecursiveArrayToolsZygoteExt.var"#48#53")(y::@NamedTuple{u::Vector{Vector{Float64}}})
    @ RecursiveArrayToolsZygoteExt ~/.julia/packages/RecursiveArrayTools/EfvwE/ext/RecursiveArrayToolsZygoteExt.jl:79
  [2] (::RecursiveArrayToolsZygoteExt.var"#90#back#56"{RecursiveArrayToolsZygoteExt.var"#48#53"})(Δ::Base.RefValue{Any})
    @ RecursiveArrayToolsZygoteExt ~/.julia/packages/ZygoteRules/CkVIK/src/adjoint.jl:72
  [3] loss
    @ ~/.julia/dev/DyadModelOptimizer/test/calibrate/ad.jl:91 [inlined]
  [4] (::Zygote.Pullback{Tuple{typeof(loss), Vector{…}, Tuple{…}}, Any})(Δ::Float64)
    @ Zygote ~/.julia/packages/Zygote/55SqB/src/compiler/interface2.jl:0
  [5] #25
    @ ~/.julia/dev/DyadModelOptimizer/test/calibrate/ad.jl:100 [inlined]
  [6] (::Zygote.Pullback{Tuple{var"#25#26", Vector{…}}, Tuple{Zygote.var"#2006#back#208"{…}, Zygote.Pullback{…}}})(Δ::Float64)
    @ Zygote ~/.julia/packages/Zygote/55SqB/src/compiler/interface2.jl:0
  [7] (::Zygote.var"#88#89"{Zygote.Pullback{Tuple{…}, Tuple{…}}})(Δ::Float64)
    @ Zygote ~/.julia/packages/Zygote/55SqB/src/compiler/interface.jl:97
  [8] gradient(f::Function, args::Vector{Float64})
    @ Zygote ~/.julia/packages/Zygote/55SqB/src/compiler/interface.jl:154
  [9] gradient
    @ ~/.julia/packages/DifferentiationInterface/D3LUI/ext/DifferentiationInterfaceZygoteExt/DifferentiationInterfaceZygoteExt.jl:123 [inlined]
 [10] gradient(::var"#25#26", ::AutoZygote, ::Vector{Float64})
    @ DifferentiationInterface ~/.julia/packages/DifferentiationInterface/D3LUI/src/first_order/gradient.jl:63
 [11] top-level scope

@DhairyaLGandhi
Member Author

Cc @ChrisRackauckas

@ChrisRackauckas
Member

I don't think there's an issue with narrowing the type for the backwards pass, but it would be good to define a part of the interface that is `reconstruct_from_u` or something that copies over the other fields.
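One way to read this suggestion: a small interface function that rebuilds a container from new `u` data while carrying over the remaining fields, so adjoints don't have to hard-code a constructor per type. A hypothetical sketch (`reconstruct_from_u` is not an existing RecursiveArrayTools function):

```julia
using RecursiveArrayTools

# Hypothetical interface sketch: rebuild a container from new `u` data,
# copying any other fields (such as the time points) from the template `A`.
reconstruct_from_u(A::VectorOfArray, u) = VectorOfArray(u)
reconstruct_from_u(A::DiffEqArray, u) = DiffEqArray(u, A.t)
```

A pullback could then return `reconstruct_from_u(A, ȳ.u)` regardless of the concrete container type, instead of special-casing each subtype of `AbstractVectorOfArray`.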

@ChrisRackauckas ChrisRackauckas merged commit f8ebf8d into SciML:master Aug 14, 2025
25 checks passed