In this work, we discuss Automatic Adjoint Differentiation (AAD) for functions of the form $G=\frac{1}{2}\sum_{i=1}^m (Ey_i-C_i)^2$, which often appear in the calibration of stochastic models. We demonstrate that AAD for such functions admits perfect SIMD\footnote{Single Instruction, Multiple Data} parallelization and derive its relative computational cost. In addition, we demonstrate that this theoretical result is in concordance with numerical experiments.