We develop a systematic information-theoretic framework for quantifying and mitigating error in probabilistic Lagrangian (i.e., trajectory-based) predictions obtained from (Eulerian) vector fields that generate the underlying dynamical system, in a way that applies naturally in both deterministic and stochastic settings. This work is motivated by the desire to improve Lagrangian predictions in complex, multi-scale systems based on simplified, data-driven models. Here, discrepancies between the probability measures μ and ν associated with the true dynamics and its approximation are quantified via so-called φ-divergences, Dφ(μ∥ν), which are premetrics defined by a class of strictly convex functions φ. We derive general information bounds on the uncertainty in estimates, Eν[f], of 'true' observables Eμ[f] in terms of φ-divergences; we then derive two distinct bounds on Dφ(μ∥ν) itself. First, an analytically tractable bound on Dφ(μ∥ν) is derived from differences between the vector fields generating the true dynamics and its approximations. The second bound on Dφ(μ∥ν) is based on differences of so-called finite-time divergence rate (FTDR) fields, and it can be exploited within a computational framework to mitigate the error in Lagrangian predictions by tuning the fields of expansion rates obtained from simplified models. This new framework provides a systematic link between Eulerian (field-based) model error and the resulting uncertainty in Lagrangian (trajectory-based) predictions.
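The abstract defines φ-divergences Dφ(μ∥ν) through strictly convex functions φ. As a minimal illustrative sketch (not code from the paper), the discrete case Dφ(μ∥ν) = Σᵢ νᵢ φ(μᵢ/νᵢ) can be computed directly; the choice φ(x) = x log x recovers the Kullback–Leibler divergence, a standard member of the family. The function name `phi_divergence` and the example probability vectors are assumptions for illustration only.

```python
import numpy as np

def phi_divergence(mu, nu, phi):
    """Discrete phi-divergence D_phi(mu || nu) = sum_i nu_i * phi(mu_i / nu_i).

    Assumes mu, nu are probability vectors with nu_i > 0 wherever mu_i > 0.
    """
    mu = np.asarray(mu, dtype=float)
    nu = np.asarray(nu, dtype=float)
    return float(np.sum(nu * phi(mu / nu)))

# phi(x) = x * log(x) is strictly convex with phi(1) = 0 and
# yields the Kullback-Leibler divergence as a special case.
kl_phi = lambda x: x * np.log(x)

mu = [0.5, 0.3, 0.2]  # 'true' distribution (illustrative)
nu = [0.4, 0.4, 0.2]  # approximating distribution (illustrative)
d = phi_divergence(mu, nu, kl_phi)
```

Consistent with the premetric property noted in the abstract, Dφ(μ∥ν) ≥ 0 with equality when μ = ν, which the sketch reproduces numerically.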
Number of pages: 61
Journal: SIAM/ASA Journal on Uncertainty Quantification
Publication status: Accepted/In press - 30 Jun 2021