11-15 September 2023
Budker INP

Differentiable Accelerator Modeling Library

12 Sep 2023, 17:00
1h 30m
Board: 076
Poster · Beam dynamics, beam cooling methods, new approaches... · Posters I

Speaker

Ivan Morozov

Description

Application of automatic differentiation (AD) has a long history in accelerator modeling. Forward-mode AD is used primarily to construct Taylor approximations of one-turn transformations, to compute normal forms, and for various other tasks; in this setting the differentiable variables are the phase space coordinates. It is also feasible to treat knob parameters as differentiable variables and thereby obtain the parametric dependence of quantities of interest. The continuous advancement of machine learning (ML) tools has produced diverse and comprehensive libraries that make AD straightforward to apply.

This paper presents a detailed description of a differentiable accelerator modeling library built on the PyTorch framework. The library can compute higher-order partial derivatives with respect to one or several tensor-like variables, which enables the computation of parameter-dependent fixed points, coupled Twiss parameters, Taylor transport maps, invariants, and various other observables. Computation of the parametric closed orbit and Twiss parameters makes it possible to construct response matrices and to perform sensitivity analysis. Since derivatives are readily available, gradient-based methods can be employed effectively for optics matching. First-order phase space derivatives can be used for chaos identification, Hessians of objective functions enable informed initialization of derivative-free optimization, and higher-order derivatives can be used to build surrogate Taylor models of transport maps or other observables.
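For illustration only, the sketch below is not the library's actual interface; it shows how nested calls to PyTorch's torch.func.jacfwd (available in PyTorch 2.0 and later) can produce the kind of higher-order and knob derivatives mentioned above for a made-up one-turn map. The map one_turn, the phase advance MU, and the knob strength k are hypothetical toy quantities introduced for this example.

```python
import math
import torch
from torch.func import jacfwd

MU = 2.0 * math.pi * 0.31  # toy betatron phase advance per turn (assumed value)

def one_turn(x, k):
    """Toy one-turn map: linear rotation plus a thin sextupole-like kick of strength k."""
    q, p = x[0], x[1]
    p = p + k * q ** 2                         # nonlinear kick controlled by the knob k
    q_new = q * math.cos(MU) + p * math.sin(MU)
    p_new = -q * math.sin(MU) + p * math.cos(MU)
    return torch.stack([q_new, p_new])

x0 = torch.tensor([1.0e-3, 0.0])  # phase space coordinates (q, p)
k0 = torch.tensor(0.5)            # knob parameter (kick strength)

# One-turn matrix: first derivatives with respect to phase space coordinates.
matrix = jacfwd(one_turn, argnums=0)(x0, k0)                          # shape (2, 2)

# Second-order coefficients of a Taylor expansion of the map around x0.
second = jacfwd(jacfwd(one_turn, argnums=0), argnums=0)(x0, k0)       # shape (2, 2, 2)

# Mixed derivative with respect to coordinates and the knob, i.e. the
# parametric dependence of the one-turn matrix on k.
d_matrix_dk = jacfwd(jacfwd(one_turn, argnums=0), argnums=1)(x0, k0)  # shape (2, 2)

print(matrix)
print(second.shape, d_matrix_dk.shape)
```

Nesting further jacfwd calls, or differentiating with respect to additional tensor-like arguments, extends the same pattern to higher orders and more knobs; the sketch is only meant to indicate how such derivatives can be obtained with PyTorch, not how the presented library organizes these computations.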
