JuliaHEP 2025 Workshop Report
This is a report on the JuliaHEP 2025 Workshop, which I attended in late July 2025 alongside JuliaCon 2025.
From JuliaCon 2025 to JuliaHEP 2025 Workshop
From July 22–26, 2025, JuliaCon 2025, the international conference for the Julia language, was held in Pittsburgh, USA; you can find a report on that here. Subsequently, from July 28–31, 2025, the international workshop "JuliaHEP 2025 Workshop" was held in Princeton, USA.
To get there, I flew from Pittsburgh International Airport (PIT) to Newark Liberty International Airport (EWR) and then took a train to Princeton Station, the closest station to Princeton University. Google Maps suggested a route with a bus transfer partway through, but the bus stop turned out not to exist, and I had to backtrack and take the train instead. Anyone planning to visit Princeton University should be careful. Also note that you must change trains at Princeton Junction Station, since the tracks are not directly connected; the connecting line is a charming single-track train.

Venue
The workshop was held at Princeton University. The campus was lined with magnificent buildings that looked like they could have appeared in Harry Potter. There were also squirrels 🐿

Other photos



Presentations
Here is a glimpse of some selected research presentations:
Julia in HEP: diff 2024 2025
There was a mention of NumericalDistributions.jl. Whereas Distributions.jl is used by choosing from its pre-implemented distributions, this package appears to let users define and use their own distribution functions. It looks like a very convenient package if you are not strictly chasing speed or precision.
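For contrast, here is the standard Distributions.jl workflow the text refers to, where you pick one of the library's ready-made distributions (this sketch uses the normal distribution; NumericalDistributions.jl itself is not shown, since I have not verified its API):

```julia
# Standard Distributions.jl usage: choose a pre-implemented distribution
# and query it through the common interface.
using Distributions

d = Normal(0.0, 1.0)   # standard normal distribution

pdf(d, 0.0)            # density at x = 0, i.e. 1/sqrt(2π)
cdf(d, 1.96)           # cumulative probability up to 1.96
rand(d, 5)             # draw five random samples
```

Every distribution in the package supports the same `pdf`/`cdf`/`rand` interface, which is exactly why defining a custom distribution requires going outside the pre-implemented set.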
Julia at Princeton
According to the reported materials, there are more than 100 users at Princeton University alone. Japan cannot afford to fall behind!
Hands-on session: Introduction to Julia for scientists
I checked with the author; since the material is open source, they said translating it into Japanese is perfectly fine.
PrecisionCarriers.jl: Easy Detection of Floating Point Precision Loss
The talk discussed topics that are very important when evaluating high-precision arithmetic packages such as MultiFloats.jl.
ML for HEP Analysis in Julia
This is a tutorial on machine learning in High Energy Physics (HEP). There were several useful packages mentioned.
Efficient Matrix-Element Generation in Julia
There was a mention of KernelAbstractions.jl. It seems that NVIDIA CUDA, AMD ROCm, Intel oneAPI, and Apple Metal can be used within the same framework.
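A minimal sketch of what "the same framework" means in practice: a KernelAbstractions.jl kernel is written once and instantiated for a backend. Here it runs on the CPU backend; on a GPU you would pass e.g. `CUDABackend()` (from CUDA.jl) instead. The kernel name `axpy!` is my own choice for illustration.

```julia
using KernelAbstractions

# A vendor-agnostic "y = a*x + y" kernel: the same code can target
# CPU, NVIDIA CUDA, AMD ROCm, Intel oneAPI, or Apple Metal backends.
@kernel function axpy!(y, a, @Const(x))
    i = @index(Global)      # global work-item index
    y[i] = a * x[i] + y[i]
end

backend = CPU()              # swap in CUDABackend(), ROCBackend(), … for GPUs
x = collect(Float32, 1:1024)
y = ones(Float32, 1024)

kernel! = axpy!(backend)     # instantiate the kernel for this backend
kernel!(y, 2.0f0, x; ndrange = length(x))
KernelAbstractions.synchronize(backend)
```

The backend object is the only line that changes when moving between vendors; the kernel body and launch syntax stay the same.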
Miscellaneous
I had been struggling to find best practices for computing second-order derivatives with automatic differentiation, and I was introduced to DifferentiationInterface.jl. It lets you use backends such as Zygote and Enzyme through a unified interface and supports second-order derivatives. I have written a separate article about it.
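A small sketch of the unified interface for second-order derivatives, using the ForwardDiff backend (Zygote or Enzyme can be selected the same way by swapping the backend object; the test function `f` is my own example):

```julia
using DifferentiationInterface
import ForwardDiff   # loads the backend; Zygote/Enzyme work the same way

f(x) = sin(x) * x^2
backend = AutoForwardDiff()   # backend selector (re-exported from ADTypes)

d1 = derivative(f, backend, 1.0)          # f'(1)  = cos(1) + 2*sin(1)
d2 = second_derivative(f, backend, 1.0)   # f''(1) = sin(1) + 4*cos(1)
```

Switching backends means changing only the `Auto...()` object, which makes it easy to benchmark Zygote against Enzyme on the same derivative code.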
Conclusion
Beyond the presentations, I received valuable advice on performance, high-precision computing, and automatic differentiation. In particular, DifferentiationInterface.jl is quite impactful, and I hope it becomes more widely known in Japan.