Course Feedback
Automatic Differentiation Training, Sophia Antipolis, February 2017
The Automatic/Algorithmic Differentiation (AD) training was presented by Laurent Hascoet (INRIA) and by Andrea Walther and Kshitij Kulshreshtha (University of Paderborn). The training covered both theoretical aspects and practical examples. The lectures explained the core of an AD software tool, namely forward- and reverse-mode differentiation, and how it can be applied to a computer program to obtain the required derivatives (e.g. gradients) of first, second and higher order.
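For readers unfamiliar with the forward mode, the following is a minimal, self-contained sketch (not taken from the course material) of how operator overloading can propagate a derivative alongside each value; the Dual type and the test function f(x) = x*sin(x) are illustrative choices only.

    #include <cmath>
    #include <iostream>

    // Minimal "dual number": a value paired with its derivative with respect
    // to one chosen input; every operation updates both at once.
    struct Dual {
        double val;  // function value
        double dot;  // derivative (tangent) value
    };

    Dual operator*(Dual a, Dual b) {
        return {a.val * b.val, a.dot * b.val + a.val * b.dot};  // product rule
    }

    Dual sin(Dual a) {
        return {std::sin(a.val), std::cos(a.val) * a.dot};      // chain rule
    }

    int main() {
        Dual x{2.0, 1.0};        // differentiate at x = 2, seeding dx/dx = 1
        Dual f = x * sin(x);     // f(x) = x * sin(x)
        std::cout << "f(2)  = " << f.val << "\n"
                  << "f'(2) = " << f.dot << "\n";  // equals sin(2) + 2*cos(2)
        return 0;
    }

The reverse mode, in contrast, records the computation and propagates adjoints backwards through it, which is what makes gradients of functions with many inputs cheap to obtain.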
Another very interesting part of the training was the hands-on sessions. Two AD tools widely used in the IODA project, ADOL-C and Tapenade (representing the operator-overloading and source-transformation approaches, respectively), were applied and demonstrated on many examples; a small sketch of the ADOL-C workflow is given below. Each participant had the opportunity to install the AD tools on their own workstation and to develop and compile all the examples. The AD training was constructive both for beginners and for ESRs who already have some experience with AD, and it was especially useful for those who plan to integrate AD tools into their existing optimisation workflows.
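As an illustration of the operator-overloading approach in practice, here is a minimal sketch of a typical ADOL-C session, assuming a standard ADOL-C installation; the tape tag (1) and the test function are illustrative choices only, not the examples used in the course.

    #include <adolc/adolc.h>
    #include <iostream>

    int main() {
        const int n = 2;
        double xp[n] = {1.0, 2.0};    // point at which to differentiate
        double yp;                    // passive function value
        adouble x[n], y;              // active variables recorded on the tape

        trace_on(1);                  // start recording on tape tag 1
        for (int i = 0; i < n; ++i)
            x[i] <<= xp[i];           // mark independents, assign values
        y = x[0] * x[0] * sin(x[1]);  // computation to be differentiated
        y >>= yp;                     // mark the dependent, retrieve its value
        trace_off();                  // stop recording

        double g[n];
        gradient(1, n, xp, g);        // reverse-mode gradient driver on the tape

        std::cout << "f = " << yp
                  << ", df/dx0 = " << g[0]
                  << ", df/dx1 = " << g[1] << "\n";
        return 0;
    }

Tapenade, by contrast, reads the original source code (Fortran or C) and generates new differentiated source, so the original program's variable types do not need to be changed.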
Mladen Banovic