Computing the Laplacian with Automatic Differentiation in Julia
In this article, we will calculate gradients and Laplacians using ForwardDiff.jl and Zygote.jl, two automatic differentiation packages for the Julia language. As of January 5, 2025, Enzyme.jl does not provide a Hessian, so I have excluded it from the candidates. ForwardDiff.jl and Zygote.jl both implement a Hessian, so the Laplacian can easily be computed by taking its trace. Since only the diagonal entries of the Hessian are needed, this could certainly be sped up, but I will not pursue that here because implementing it is a lot of work. According to Mr. Nagai, Zygote.jl provides diaghessian. ForwardDiff.jl might gain a Laplacian implementation in the future.
Gradients and Laplacians
The gradient can be calculated using gradient, and the Laplacian can be calculated by taking the trace of the hessian.
import LinearAlgebra
import ForwardDiff
f(x) = x[1]^3 + 2*x[2]^4 + 3*x[2]^5
∇f(x) = ForwardDiff.gradient(f, x)
∇²f(x) = LinearAlgebra.tr(ForwardDiff.hessian(f, x))
@show ∇f([1.0, 1.0, 1.0])
@show ∇²f([1.0, 1.0, 1.0])
# ∇f([1.0, 1.0, 1.0]) = [3.0, 23.0, 0.0]
# ∇²f([1.0, 1.0, 1.0]) = 90.0
In the case of Zygote.jl, the Laplacian can also be calculated as the sum of the diagonal entries returned by diaghessian.
import LinearAlgebra
import Zygote
f(x) = x[1]^3 + 2*x[2]^4 + 3*x[2]^5
∇f(x) = first(Zygote.gradient(f, x))
# ∇²f(x) = LinearAlgebra.tr(Zygote.hessian(f, x))
∇²f(x) = sum(first(Zygote.diaghessian(f, x)))
@show ∇f([1.0, 1.0, 1.0])
@show ∇²f([1.0, 1.0, 1.0])
# ∇f([1.0, 1.0, 1.0]) = [3.0, 23.0, 0.0]
# ∇²f([1.0, 1.0, 1.0]) = 90.0
Speed Comparison
We compare the calculation speed of the Laplacian using the following four methods:
The version using Zygote.jl's diaghessian was the fastest.
import BenchmarkTools
import LinearAlgebra
import ForwardDiff
import Zygote
f(x) = x[1]^3 + 2*x[2]^4 + 3*x[2]^5
BenchmarkTools.@btime LinearAlgebra.tr(Zygote.hessian(f, [1.0, 1.0, 1.0]))
BenchmarkTools.@btime LinearAlgebra.tr(Zygote.hessian_reverse(f, [1.0, 1.0, 1.0]))
BenchmarkTools.@btime sum(first(Zygote.diaghessian(f, [1.0, 1.0, 1.0])))
BenchmarkTools.@btime LinearAlgebra.tr(ForwardDiff.hessian(f, [1.0, 1.0, 1.0]))
# 309.426 ns (15 allocations: 832 bytes)
# 393.900 μs (2939 allocations: 247.36 KiB)
# 271.617 ns (14 allocations: 656 bytes)
# 472.959 ns (12 allocations: 1.41 KiB)
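As noted in the introduction, only the diagonal of the Hessian is actually needed, so the full-Hessian approaches above do redundant work. One way to avoid this with ForwardDiff.jl is to take a second derivative along each coordinate separately. The sketch below illustrates the idea; the names d2 and laplacian are my own helpers, not part of either package, and the coordinate-restriction closure is written for clarity rather than speed.

```julia
import ForwardDiff

# Second derivative of a scalar function g at t, via nested ForwardDiff.derivative.
d2(g, t) = ForwardDiff.derivative(s -> ForwardDiff.derivative(g, s), t)

# Laplacian as the sum of per-coordinate second derivatives.
# Only the diagonal of the Hessian is ever computed.
function laplacian(f, x)
    sum(eachindex(x)) do i
        # Restrict f to the i-th coordinate, holding the others fixed.
        d2(t -> f([j == i ? t : x[j] for j in eachindex(x)]), x[i])
    end
end

f(x) = x[1]^3 + 2*x[2]^4 + 3*x[2]^5
laplacian(f, [1.0, 1.0, 1.0])  # 90.0, matching tr(hessian)
```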
Verification
We will calculate the gradient and Laplacian of a Gaussian function and verify that they match the analytically derived formulas. The x-component of the gradient can be calculated as follows; the y and z components follow in the same way.
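For reference, with f(x) = exp(-a(x₁² + x₂² + x₃²)), differentiating by hand gives the expressions that the analytical code below implements:

```latex
\frac{\partial f}{\partial x_i} = -2 a x_i \, e^{-a(x_1^2 + x_2^2 + x_3^2)},
\qquad
\nabla^2 f = \left(-6a + 4a^2\left(x_1^2 + x_2^2 + x_3^2\right)\right) e^{-a(x_1^2 + x_2^2 + x_3^2)}
```

The Laplacian follows because each second derivative is ∂²f/∂xᵢ² = (-2a + 4a²xᵢ²) f, and summing over the three coordinates gives the prefactor above.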
import Zygote
f(x) = exp(-(x[1]^2 + x[2]^2 + x[3]^2))
∇f(x) = first(Zygote.gradient(f, x))
∇f([1.0, 1.0, 1.0])
# 3-element Vector{Float64}:
# -0.09957413673572789
# -0.09957413673572789
# -0.09957413673572789
∇f_analytical(x; a=1.0) = [
-2*a*x[1]*exp(-a*(x[1]^2 + x[2]^2 + x[3]^2))
-2*a*x[2]*exp(-a*(x[1]^2 + x[2]^2 + x[3]^2))
-2*a*x[3]*exp(-a*(x[1]^2 + x[2]^2 + x[3]^2))
]
∇f_analytical([1.0, 1.0, 1.0])
# 3-element Vector{Float64}:
# -0.09957413673572789
# -0.09957413673572789
# -0.09957413673572789
The Laplacian for the Gaussian function can be calculated as follows.
import Zygote
f(x) = exp(-(x[1]^2 + x[2]^2 + x[3]^2))
∇²f(x) = sum(first(Zygote.diaghessian(f, x)))
∇²f([1.0, 1.0, 1.0])
# 0.29872241020718365
# Analytical Laplacian
∇²f_analytical(x; a=1.0) = (-6*a + 4*a^2*(x[1]^2 + x[2]^2 + x[3]^2)) * exp(-a * (x[1]^2 + x[2]^2 + x[3]^2))
∇²f_analytical([1.0, 1.0, 1.0])
# 0.29872241020718365
Summary
Gradients and Laplacians were successfully calculated with both ForwardDiff.jl and Zygote.jl. The results match the analytically calculated gradient and Laplacian of the Gaussian function.
Version information
Julia v1.11.2
ForwardDiff v0.10.38
Zygote v0.6.75