Hi there!
@adrhill and I recently started https://github.com/gdalle/DifferentiationInterface.jl to provide a common interface for automatic differentiation in Julia. We're currently chatting with Lux.jl, Flux.jl and Optimization.jl to see how they can benefit from it, and so my mind went to Turing.jl as another AD power user :)
DifferentiationInterface.jl only guarantees support for functions of the form `f(x) = y` or `f!(y, x)`, with standard numbers or arrays in and out. Within these restrictions, we are compatible with 13 different AD backends, including the cool kids like Enzyme.jl and even the hipsters like Tapir.jl. Do you think it could come in handy?
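To give a flavor of what this looks like, here is a rough usage sketch. The exact names and argument order have evolved across DifferentiationInterface.jl versions, and the backend struct comes from ADTypes, so treat this as illustrative rather than a stable API reference:

```julia
# Illustrative sketch of DifferentiationInterface.jl usage; exact API may
# differ between versions (argument order and exports have changed).
using DifferentiationInterface
import ForwardDiff  # one of the supported AD backends

f(x) = sum(abs2, x)          # a function of the form f(x) = y
backend = AutoForwardDiff()  # backend selected via an ADTypes-style struct

x = [1.0, 2.0, 3.0]
# Compute the value and gradient in one call, backend-agnostically.
y, grad = value_and_gradient(f, backend, x)
```

Swapping `AutoForwardDiff()` for another backend struct is, in principle, the only change needed to switch AD engines.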
Ping @yebai @willtebbutt