Conversation

@ChrisRackauckas-Claude

Summary

  • Fix Core.Box closure capture issues in instantiate_function methods that caused type instability
  • Rewrite all closures to use let blocks for proper variable capture and type inference
  • Variables like prep_grad, prep_hess, etc. were being boxed because Julia couldn't prove they wouldn't be reassigned

Problem

The original code had closure patterns like:

```julia
if g == true && f.grad === nothing
    prep_grad = prepare_gradient(f.f, adtype, x, Constant(p))
    function grad(res, θ)
        return gradient!(f.f, res, prep_grad, adtype, θ, Constant(p))
    end
    # ... more code that might reassign prep_grad
end
```

Julia wraps prep_grad in Core.Box because it cannot prove the variable won't be reassigned later in the scope. This causes:

  1. Return type inferred as Any instead of Vector{Float64}
  2. Heap allocations on every gradient call
  3. Dynamic dispatch overhead
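
For illustration, a self-contained sketch of the same boxing behavior (hypothetical toy functions, not the OptimizationBase code) and how a `let` rebinding avoids it:

```julia
# A captured variable that is reassigned after capture gets boxed.
function make_boxed()
    prep = 1.0
    f = x -> x + prep      # `prep` is captured here...
    prep = 2.0             # ...and reassigned here, so Julia boxes it
    return f
end

# Rebinding through `let` gives the closure its own never-reassigned binding.
function make_unboxed()
    prep = 1.0
    f = let prep = prep
        x -> x + prep
    end
    prep = 2.0             # does not affect the closure's captured binding
    return f
end

fieldtypes(typeof(make_boxed()))    # (Core.Box,)  — abstractly typed capture
fieldtypes(typeof(make_unboxed()))  # (Float64,)   — concretely typed capture
```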

Solution

Use let blocks to create new bindings:

```julia
grad = if g == true && f.grad === nothing
    _prep_grad = prepare_gradient(f.f, adtype, x, Constant(p))
    let _prep_grad = _prep_grad, f = f, adtype = adtype, p = p
        (res, θ) -> gradient!(f.f, res, _prep_grad, adtype, θ, Constant(p))
    end
    # ...
end
```
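
The same rebinding applies to the other derivative closures. A hedged sketch for the Hessian case, assuming DifferentiationInterface's `prepare_hessian`/`hessian!` follow the same calling pattern as `prepare_gradient`/`gradient!` above, and where `h` stands for the Hessian flag analogous to `g`:

```julia
hess = if h == true && f.hess === nothing
    _prep_hess = prepare_hessian(f.f, adtype, x, Constant(p))
    let _prep_hess = _prep_hess, f = f, adtype = adtype, p = p
        # `_prep_hess` is a fresh binding that is never reassigned,
        # so it is captured concretely rather than through a Core.Box
        (res, θ) -> hessian!(f.f, res, _prep_hess, adtype, θ, Constant(p))
    end
    # ...
end
```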

Verification

Before fix (@code_warntype showed):

```
Body::Any
│   %46 = Core.Box::Type{Core.Box}
│   %47 = (%46)(%45)::Core.Box
```

After fix (@code_warntype shows):

```
Body::Vector{Float64}
```

Benchmark improvement:

```julia
# Gradient computation now shows 0 allocations in steady state:
@btime $grad!($G, $x)
  61.255 ns (0 allocations: 0 bytes)
```
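
The same steady-state behavior can be checked without BenchmarkTools; a minimal sketch, assuming `grad!`, `G`, and `x` are the closure and preallocated buffers from the benchmark above:

```julia
using Test

# Hypothetical helper; wrapping the checks in a function avoids spurious
# allocations from non-const globals.
function check_grad(grad!, G, x)
    grad!(G, x)                          # warm-up call so compilation is excluded
    @inferred grad!(G, x)                # throws if the return type is not concretely inferred
    @test (@allocated grad!(G, x)) == 0  # steady-state call should be allocation-free
end

check_grad(grad!, G, x)
```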

Testing

  • All 703 OptimizationBase tests pass
  • Tested with ForwardDiff, ReverseDiff AD backends
  • Basic optimization (BFGS, LBFGS) verified to converge correctly
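
As an illustration of that convergence check, a minimal hypothetical reproduction (not the actual test suite), assuming the standard Optimization.jl + OptimizationOptimJL stack with a ForwardDiff backend:

```julia
using Optimization, OptimizationOptimJL

# Rosenbrock function with parameters p = (a, b); minimum at (a, a^2)
rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2

optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, zeros(2), [1.0, 100.0])
sol = solve(prob, BFGS())

@assert isapprox(sol.u, [1.0, 1.0]; atol = 1e-4)
```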

cc @ChrisRackauckas

🤖 Generated with Claude Code

The `instantiate_function` methods in OptimizationDIExt.jl had closure
capture issues where variables like `prep_grad`, `prep_hess`, etc. were
being captured by closures but could potentially be reassigned later in
the same function scope. Julia's compiler could not prove these variables
wouldn't be reassigned, so it wrapped them in Core.Box causing:

1. Type instability (return type inferred as Any instead of Vector{Float64})
2. Heap allocations on every gradient/hessian call
3. Slower execution due to dynamic dispatch

This fix rewrites all closures to use `let` blocks which create new
local bindings, ensuring type stability. The pattern used is:
```julia
let var = var, f = f, adtype = adtype
    (args...) -> use_captured_variables(var, f, adtype, args...)
end
```

Verified improvements:
- @code_warntype now shows proper return type inference (Vector{Float64})
- Gradient computation shows 0 allocations in steady state
- All 703 OptimizationBase tests pass

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <[email protected]>
ChrisRackauckas merged commit 46819c1 into SciML:master on Jan 8, 2026
64 of 81 checks passed