Abstract:
Biochemical reaction networks (BCRNs), which describe the chemical reactions occurring between molecules inside a living cell, play a central role in systems biology and related areas. Mathematical and stochastic modelling, together with computer simulation, have long been recognized as important approaches to studying many aspects of such complex networks. In particular, we consider an elementary stochastic model for protein production and degradation. This simple model is also a component of more complex models for gene regulatory networks, a particular family of BCRNs. It is known that this stochastic model is a continuous-time Markov process (a simple birth-death process), Markov processes being a very important class of stochastic processes. In the present simplified model, the protein is produced at a constant rate 𝑘1, while each protein molecule is degraded at rate 𝑘2. The equilibrium distribution of this simple network is a Poisson distribution with parameter 𝑘1/𝑘2. For different values of 𝑘1 and 𝑘2 we obtain a family of such networks, each with its own Poisson distribution; that is, we identify each network with its distribution. Thus, we construct a differentiable manifold whose elements are such simple protein networks (Poisson distributions) and study their properties from information-theoretical and geometrical points of view. We first derive some information functionals of the Poisson distribution, namely the entropy, the relative entropy (also known as the Kullback-Leibler divergence), and the Fisher information matrix. We elaborate on the relationship between the relative entropy and the Fisher information matrix by expanding the relative entropy functional in a Taylor series. The exponential family structure of the Poisson distributions is discussed, demonstrating the importance of this concept for many desirable intrinsic properties of these distributions, such as parameter estimation.
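The claim that the equilibrium distribution is Poisson with parameter 𝑘1/𝑘2 can be checked numerically. The following minimal sketch (not taken from the paper; the rate values 𝑘1 = 10 and 𝑘2 = 1 are illustrative) builds the stationary distribution of the birth-death chain from its detailed-balance relation and compares it with the Poisson probability mass function.

```python
import math

# Illustrative rates (assumed values, not from the paper):
# production rate k1, per-molecule degradation rate k2.
k1, k2 = 10.0, 1.0
N = 100  # truncation level for the state space {0, 1, ..., N}

# Detailed balance for the birth-death chain:
#   pi[n] * k1 = pi[n+1] * (n+1) * k2,
# hence pi[n+1] = pi[n] * k1 / ((n+1) * k2).
pi = [1.0]
for n in range(N):
    pi.append(pi[-1] * k1 / ((n + 1) * k2))
total = sum(pi)
pi = [p / total for p in pi]  # normalize over the truncated state space

# Compare with the Poisson pmf with parameter lam = k1 / k2.
lam = k1 / k2
poisson = [math.exp(-lam) * lam**n / math.factorial(n) for n in range(N + 1)]
max_err = max(abs(a - b) for a, b in zip(pi, poisson))
```

Since the Poisson tail beyond the truncation level is negligible for these rates, `max_err` is tiny, confirming that the stationary distribution coincides with Poisson(𝑘1/𝑘2).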
In this new setting, we elucidate entropy as a measure of the complexity of the network under consideration. Geometrically, the relationship between the Fisher information and the relative entropy shows that the Fisher information is an appropriate measure of the sensitivity of the parameters. This follows from the facts that the relative entropy can be used as a measure of the discrepancy between two distributions and that the Fisher information can be used to measure the efficiency of a parameter estimate.
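The Taylor-expansion relationship mentioned above can be made concrete. For two Poisson distributions the relative entropy has the closed form KL(Poisson(λ) || Poisson(λ')) = λ log(λ/λ') + λ' − λ, and the Fisher information of the one-parameter Poisson family is I(λ) = 1/λ. The following sketch (the parameter value λ = 10 is an illustrative choice, playing the role of 𝑘1/𝑘2) checks that KL(λ, λ+ε) ≈ ½ I(λ) ε² for small ε, i.e. that the leading Taylor term of the relative entropy is the Fisher-information quadratic form.

```python
import math

def kl_poisson(lam1, lam2):
    """Closed-form relative entropy KL(Poisson(lam1) || Poisson(lam2))."""
    return lam1 * math.log(lam1 / lam2) + lam2 - lam1

lam = 10.0           # illustrative parameter (k1/k2 in the model)
fisher = 1.0 / lam   # Fisher information of the Poisson family at lam

for eps in (1.0, 0.1, 0.01):
    exact = kl_poisson(lam, lam + eps)
    quadratic = 0.5 * fisher * eps**2
    # The ratio exact/quadratic tends to 1 as eps -> 0, showing that the
    # relative entropy is locally the Fisher-information quadratic form.
    print(eps, exact, quadratic, exact / quadratic)
```

This is why the Fisher information governs parameter sensitivity: perturbing a parameter by ε changes the distribution, as measured by relative entropy, by approximately ½ I(λ) ε².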