Sensitivity, robustness, and identifiability in stochastic chemical kinetics models

Michał Komorowski, Maria J Costa, David A Rand, Michael P H Stumpf, Karen Halliday

Research output: Contribution to journal · Article · peer-review

Abstract

We present a novel and simple method to numerically calculate Fisher information matrices for stochastic chemical kinetics models. The linear noise approximation is used to derive model equations and a likelihood function that leads to an efficient computational algorithm. Our approach reduces the problem of calculating the Fisher information matrix to solving a set of ordinary differential equations. This is the first method to compute Fisher information for stochastic chemical kinetics models without the need for Monte Carlo simulations. This methodology is then used to study sensitivity, robustness, and parameter identifiability in stochastic chemical kinetics models. We show that significant differences exist between stochastic and deterministic models as well as between stochastic models with time-series and time-point measurements. We demonstrate that these discrepancies arise from the variability in molecule numbers, correlations between species, and temporal correlations and show how this approach can be used in the analysis and design of experiments probing stochastic processes at the cellular level. The algorithm has been implemented as a Matlab package and is available from the authors upon request.
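The paper's algorithm (released as a Matlab package) derives the full linear-noise-approximation likelihood, including temporal correlations, and obtains the Fisher information by solving sensitivity ODEs. As a much simpler illustration of the core idea — moment ODEs feeding a Gaussian Fisher information — the sketch below treats a hypothetical birth-death model (production rate k, degradation rate g) with independent time-point measurements and central finite differences for the parameter sensitivities. All function names and numerical choices here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.integrate import solve_ivp

def lna_moments(theta, t_eval):
    """Mean phi(t) and variance sigma2(t) of a birth-death process
    under the linear noise approximation, starting from zero molecules."""
    k, g = theta
    def rhs(t, y):
        phi, sigma2 = y
        # Macroscopic rate equation and the matching LNA variance equation.
        return [k - g * phi, -2.0 * g * sigma2 + k + g * phi]
    sol = solve_ivp(rhs, (0.0, t_eval[-1]), [0.0, 0.0],
                    t_eval=t_eval, rtol=1e-8, atol=1e-10)
    return sol.y[0], sol.y[1]

def fisher_information(theta, t_eval, eps=1e-6):
    """Fisher information for independent Gaussian time-point measurements,
    with sensitivities from central finite differences on the moment ODEs
    (the paper instead solves sensitivity ODEs and keeps time correlations)."""
    mu, s2 = lna_moments(theta, t_eval)
    n = len(theta)
    dmu, ds2 = [], []
    for i in range(n):
        up = np.array(theta, float)
        dn = np.array(theta, float)
        up[i] += eps
        dn[i] -= eps
        mu_p, s2_p = lna_moments(up, t_eval)
        mu_m, s2_m = lna_moments(dn, t_eval)
        dmu.append((mu_p - mu_m) / (2.0 * eps))
        ds2.append((s2_p - s2_m) / (2.0 * eps))
    F = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            # Standard Gaussian FIM: mean-sensitivity term plus variance term.
            F[i, j] = np.sum(dmu[i] * dmu[j] / s2
                             + 0.5 * ds2[i] * ds2[j] / s2 ** 2)
    return F

# Measurement times for a hypothetical design (avoid t=0, where variance is zero).
t_obs = np.linspace(1.0, 10.0, 10)
F = fisher_information([10.0, 1.0], t_obs)
```

A symmetric, positive-definite F indicates both parameters are identifiable under this design, and its inverse gives the Cramér-Rao lower bound on estimator covariance — the quantity the paper uses to compare stochastic versus deterministic models and time-series versus time-point measurements.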
Original language: English
Pages (from-to): 8645-50
Number of pages: 6
Journal: Proceedings of the National Academy of Sciences
Volume: 108
Issue number: 21
Early online date: 6 May 2011
Publication status: Published - 24 May 2011

Keywords

  • Algorithms
  • Kinetics
  • Models, Biological
  • Monte Carlo Method
  • Stochastic Processes
  • stochasticity
  • systems biology
  • parameter estimation
