@def title = "JuliaDiff"
@def tags = ["JuliaLang", "AutoDiff", "AD"]
@@topmatter
Differentiation tools in Julia
@@
Derivatives are required by many numerical algorithms, even for functions $f$ which are only accessible through their code. There are three main ways to compute them:

- Symbolic differentiation, which uses computer algebra to work out explicit formulas
- Numerical differentiation, which relies on variants of the finite difference approximation $f'(x) \approx \frac{f(x+\varepsilon) - f(x)}{\varepsilon}$
- Automatic differentiation (AD), which reinterprets the code of $f$ either in "forward" or "reverse" mode
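As a standalone illustration of the finite difference formula above (plain Julia, no packages needed), we can approximate the derivative of `sin` and compare it with the exact answer:

```julia
f = sin
ε = 1e-6

# forward finite difference at x = 1.0
fd = (f(1.0 + ε) - f(1.0)) / ε

isapprox(fd, cos(1.0); atol=1e-5)  # true: cos is the exact derivative of sin,
                                   # and the approximation carries O(ε) error
```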
In machine learning, automatic differentiation is probably the most widely used paradigm, especially in reverse mode. However, each method has its own upsides and tradeoffs, which are detailed in the following papers, along with implementation techniques like "operator overloading" and "source transformation":

- Automatic differentiation in machine learning: a survey, Baydin et al. (2018)
- A review of automatic differentiation and its efficient implementation, Margossian (2019)
JuliaDiff is an informal GitHub organization which aims to unify and document packages written in Julia for evaluating derivatives. The technical features of Julia[^1] make implementing and using differentiation techniques easier than ever before (in our biased opinion).
Discussions on JuliaDiff and its uses may be directed to the Julia Discourse forum. The ChainRules project maintains a list of recommended reading for those after more information. The autodiff.org site serves as a portal for the academic community, though it is often out of date.
What follows is a big list of Julia differentiation packages and related tooling, last updated in January 2024. If you notice anything inaccurate or outdated, please open an issue to let us know. The packages marked as inactive are those which have had no release in 2023.
The list aims to be comprehensive in coverage. By necessity, this means it is not comprehensive in detail. It is worth investigating each package yourself to really understand its ins and outs, and the pros and cons of its competitors.
**Reverse mode:**

- JuliaDiff/ReverseDiff.jl: Operator overloading AD backend
- FluxML/Zygote.jl: Source transformation AD backend
- EnzymeAD/Enzyme.jl: LLVM-level source transformation AD backend
- FluxML/Tracker.jl: Operator overloading AD backend
- compintell/Tapir.jl: Source transformation AD backend (experimental)
- dfdx/Yota.jl: Source transformation AD backend
**Forward mode:**

- JuliaDiff/ForwardDiff.jl: Operator overloading AD backend
- JuliaDiff/PolyesterForwardDiff.jl: Multithreaded version of ForwardDiff.jl
- EnzymeAD/Enzyme.jl: LLVM-level source transformation AD backend
- JuliaDiff/Diffractor.jl: Source transformation AD backend (experimental)
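As a quick taste of these backends, here is a short sketch (assuming ForwardDiff.jl and Zygote.jl are installed) that computes the same gradient in forward and reverse mode:

```julia
using ForwardDiff, Zygote

f(x) = sum(abs2, x)          # scalar-valued function of a vector
x = [1.0, 2.0, 3.0]

g_fwd = ForwardDiff.gradient(f, x)   # forward mode
g_rev, = Zygote.gradient(f, x)       # reverse mode (returns a tuple of gradients)

g_fwd ≈ g_rev ≈ 2 .* x               # both match the analytical gradient
```

Reverse mode computes the whole gradient of a scalar-valued function in a single backward sweep, which is why it dominates in machine learning.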
**Symbolic:**

- JuliaSymbolics/Symbolics.jl: Pure Julia computer algebra system with support for fast and sparse analytical derivatives
- brianguenter/FastDifferentiation.jl: Generate efficient executables for symbolic derivatives
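For instance, a short sketch of symbolic differentiation with Symbolics.jl (assuming the package is installed) could look like this:

```julia
using Symbolics

@variables x                  # declare a symbolic variable
D = Differential(x)           # differential operator d/dx

expr = x^2 + sin(x)
expand_derivatives(D(expr))   # returns the explicit formula 2x + cos(x)
```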
**Finite differences:**

- JuliaDiff/FiniteDifferences.jl: Finite differences with support for arbitrary types and higher order schemes
- JuliaDiff/FiniteDiff.jl: Finite differences with support for caching and sparsity
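As a quick sketch (assuming FiniteDifferences.jl is installed), a fifth-order central scheme already gives near machine-precision first derivatives:

```julia
using FiniteDifferences

# central_fdm(p, q): p-point central scheme for the q-th derivative
fdm = central_fdm(5, 1)

fdm(sin, 1.0) ≈ cos(1.0)   # true, far more accurate than the naive formula above
```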
**Higher order:**

- JuliaDiff/TaylorSeries.jl: Taylor polynomial expansions in one or more variables
- JuliaDiff/TaylorDiff.jl: Higher order directional derivatives (experimental)
- JuliaDiff/Diffractor.jl: Source transformation AD backend (experimental)
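A minimal sketch with TaylorSeries.jl (assuming it is installed) shows how higher-order derivatives fall out of a single expansion:

```julia
using TaylorSeries

t = Taylor1(Float64, 4)   # expansion variable around 0, truncated at order 4
p = exp(t)                # 1 + t + t²/2 + t³/6 + t⁴/24

getcoeff(p, 3)            # coefficient of t³, i.e. exp'''(0)/3! = 1/6
```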
**Interfaces:**

- gdalle/DifferentiationInterface.jl: Generic interface for first- and second-order differentiation with any AD backend on 1-argument functions (`f(x) = y` or `f!(y, x)`)
- JuliaDiff/AbstractDifferentiation.jl: Generic interface for first- and second-order differentiation with a subset of AD backends on functions with more than one argument (will soon wrap DifferentiationInterface.jl)
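As a hedged sketch of how such an interface is used (assuming a recent DifferentiationInterface.jl, with ForwardDiff.jl installed as the backend):

```julia
using DifferentiationInterface
import ForwardDiff   # load any supported backend

f(x) = sum(abs2, x)
x = [1.0, 2.0, 3.0]

# the backend is selected by an ADTypes.jl object, here AutoForwardDiff()
gradient(f, AutoForwardDiff(), x)   # swap in AutoZygote(), AutoEnzyme(), etc.
```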
These packages define derivatives for basic functions, and enable users to do the same:
- JuliaDiff/ChainRules: Ecosystem for backend-agnostic forward and reverse rules.
- JuliaDiff/ChainRulesCore.jl: Core API for users to add rules to their package.
- JuliaDiff/ChainRules.jl: Rules for Julia Base and standard libraries.
- JuliaDiff/ChainRulesTestUtils.jl: Tools for testing rules defined with ChainRulesCore.jl.
- ThummeTo/ForwardDiffChainRules.jl: Translate rules from ChainRulesCore.jl to make them compatible with ForwardDiff.jl.
- JuliaDiff/DiffRules.jl: Scalar rules used by e.g. ForwardDiff.jl, ReverseDiff.jl, Tracker.jl, and Symbolics.jl.
- EnzymeAD/EnzymeRules.jl: Rule definition API for Enzyme.jl.
- FluxML/ZygoteRules.jl: Some rules used by Zygote.jl (mostly deprecated in favor of ChainRules.jl).
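To illustrate how rule definition works, here is a hedged sketch of a custom reverse rule with ChainRulesCore.jl, for a hypothetical function `mysquare` (not part of any package):

```julia
using ChainRulesCore

mysquare(x::Real) = x^2

# an rrule returns the primal value together with a pullback, which maps
# the output cotangent ȳ back to cotangents for each argument
function ChainRulesCore.rrule(::typeof(mysquare), x::Real)
    y = mysquare(x)
    mysquare_pullback(ȳ) = (NoTangent(), 2x * ȳ)
    return y, mysquare_pullback
end
```

Any reverse-mode backend that consumes ChainRules (such as Zygote.jl) will then use this rule instead of tracing through `mysquare`.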
**Sparsity:**

- JuliaDiff/SparseDiffTools.jl: Exploit sparsity to speed up FiniteDiff.jl and ForwardDiff.jl, as well as other algorithms.
- adrhill/SparseConnectivityTracer.jl: Sparsity pattern detection for Jacobians and Hessians.
- gdalle/SparseMatrixColorings.jl: Efficient coloring and decompression algorithms for sparse Jacobians and Hessians.
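For example, a hedged sketch of sparsity pattern detection with SparseConnectivityTracer.jl (the exact function names are best checked against the package documentation) might look like:

```julia
using SparseConnectivityTracer

f(x) = [x[1]^2, 2 * x[1] * x[2], sin(x[3])]
x = rand(3)

# returns a sparse Boolean matrix marking the nonzero entries of the Jacobian
jacobian_sparsity(f, x, TracerSparsityDetector())
```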
Some complex algorithms are not natively differentiable, which is why derivatives have been implemented in the following packages:
- SciML: For many domains of scientific machine learning: differential equations, linear and nonlinear systems, optimization problems, etc.
- gdalle/ImplicitDifferentiation.jl: For generic algorithms specified by output conditions, thanks to the implicit function theorem
- jump-dev/DiffOpt.jl: For convex optimization problems
- axelparmentier/InferOpt.jl: For combinatorial optimization problems
- gaurav-arya/StochasticAD.jl: Differentiation of functions with stochastic behavior (experimental)
**Inactive:**

- denizyuret/AutoGrad.jl
- dfdx/XGrad.jl
- dpsanders/ReversePropagation.jl
- GiggleLiu/NiLang.jl
- invenia/Nabla.jl
- JuliaDiff/DualNumbers.jl
- JuliaMath/Calculus.jl
- rejuvyesh/PyCallChainRules.jl
- SciML/SparsityDetection.jl
- YingboMa/ForwardDiff2
[^1]: namely, multiple dispatch, source code via reflection, just-in-time compilation, and first-class access to expression parsing