Stationary and dynamical properties of information entropies in nonextensive systems

Hideo Hasegawa
Department of Physics, Tokyo Gakugei University, Koganei, Tokyo 184-8501, Japan
(May 23, 2008)

Abstract

The Tsallis entropy and the generalized Fisher information entropy (matrix) are important quantities expressing information measures in nonextensive systems. Properties of these information entropies have been investigated for two types of probability distributions: (a) the distribution derived from the maximum-entropy method, and (b) that obtained by the Fokker-Planck equation (FPE) method for the N-unit coupled Langevin model subjected to additive and multiplicative noise, which is a typical nonextensive system. Although the generalized Fisher information matrix is chosen to be in conformity with the Tsallis entropy, it has been shown for the case (a) that the Cramér-Rao inequality in nonextensive systems is expressed not in terms of the generalized Fisher information matrix but in terms of an alternative, modified Fisher information matrix. For the case (b), we have made a detailed analytical and numerical study of the dependences of the stationary-state entropies on the magnitudes of additive and multiplicative noise, external inputs, couplings and the number of constituent elements (N). By solving the time-dependent FPE with the partial difference equation method, the transient responses of the information entropies to changes in the magnitudes of the input signal, external force, and additive and multiplicative noise have been investigated. We have also discussed the entropy flux, the entropy production, the correlations among the entropies and the variance of the state variables, and the differences and similarities between the results calculated for the two distributions (a) and (b).

PACS No. 05.10.Gg, 05.45.-a

E-print: arXiv:0711.3923
E-mail address: [email protected]
1 Introduction
In the last half century, considerable study has been made of the Boltzmann-Gibbs-Shannon entropy and the Fisher information entropy (matrix), both of which play important roles in the thermodynamics and statistical mechanics of classical and quantum systems [1]-[7]. The entropy flux and entropy production have been investigated in connection with the space volume contraction [2]. The Fisher information matrix gives the lower bound of fluctuations through the Cramér-Rao theorem. In information geometry [8], the Fisher information matrix provides us with the distance between neighboring points in the Riemannian space spanned by probability distributions. In a usual system consisting of N particles, the entropy and energy are proportional to N (extensive), and the probability distribution is given by the Gaussian distribution belonging to the exponential family. In recent years, however, much effort has been devoted to the study of nonextensive systems, in which a physical quantity of N particles is not proportional to N [9, 10, 11]. The nonextensivity has been realized in various systems such as a system with long-range interactions, a small-scale system with large fluctuations in temperature, and a multifractal system [11, 12]. Tsallis has proposed the generalized entropy (called the Tsallis entropy hereafter) defined by [9, 10]

$$S_q(t) = \frac{k}{(q-1)}\left[1 - \int p(x,t)^q\,dx\right] \qquad (1)$$

$$= -k\int p(x,t)^q\,\ln_q p(x,t)\,dx, \qquad (2)$$
where q is the entropic index, p(x, t) denotes the probability distribution of a state x, the Boltzmann constant k is hereafter set to unity, and ln_q x expresses the q-logarithmic function defined by $\ln_q x \equiv (1 - x^{1-q})/(q-1)$. The Tsallis entropy accounts for the nonextensivity of the entropy in nonextensive systems. The probability distribution derived by the maximum-entropy method with the use of the Tsallis entropy is generally given by a non-Gaussian distribution [11]. In the limit of q → 1, ln_q x reduces to the normal ln x, and S_q(t) then agrees with the Boltzmann-Gibbs-Shannon entropy expressed by

$$S_1(t) = -\int p(x,t)\,\ln p(x,t)\,dx. \qquad (3)$$
For nonextensive systems, several authors have proposed the generalized Fisher information matrix G whose components are given by [13]-[18]

$$g_{ij} = g_{ji} = q\int p(x)\left(\frac{\partial \ln p(x)}{\partial\theta_i}\right)\left(\frac{\partial \ln p(x)}{\partial\theta_j}\right)dx, \qquad (4)$$
where p(x) = p(x; {θi }) and {θi } denotes a set of parameters specifying the distribution.
In order to derive the generalized Fisher information matrix, the distance D(p‖p′) between the two distributions p and p′ has been introduced:

$$D(p \parallel p') = K(p \parallel p') + K(p' \parallel p), \qquad (5)$$

where the generalized Kullback-Leibler relative entropy K(p‖p′) is defined by

$$K(p \parallel p') = \int p(x)^q\,[\ln_q p(x) - \ln_q p'(x)]\,dx = -\frac{1}{(q-1)}\left[1 - \int p(x)^q\,p'(x)^{1-q}\,dx\right]. \qquad (6)$$
Note that the generalized Kullback-Leibler relative entropy, which is in conformity with the Tsallis entropy, is equivalent to the α-divergence of Amari [8] with q = (1 − α)/2 [19, 20]. It has been shown that the generalized Fisher information matrix is extensive
even in nonextensive systems [19]. The escort probability and the generalized Fisher information matrix are discussed in Refs. [13, 14]. In the limit of q → 1, g_ij defined by Eq. (4) reduces to the conventional Fisher information matrix.
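For concreteness, a small sketch of the q-logarithm and the generalized Kullback-Leibler entropy of Eq. (6) is given below; the two Gaussian test densities and the value q = 1.2 are assumptions for illustration, not quantities used in the paper:

```python
# A sketch of ln_q and the generalized Kullback-Leibler entropy K(p || p')
# of Eq. (6), and the symmetrized distance D(p || p') of Eq. (5).
import numpy as np

def ln_q(x, q):
    """q-logarithm: ln_q x = (1 - x**(1 - q)) / (q - 1); reduces to ln x as q -> 1."""
    return np.log(x) if abs(q - 1.0) < 1e-12 else (1.0 - x**(1.0 - q)) / (q - 1.0)

def K_q(p, pp, dx, q):
    """K(p || p') = int p^q [ln_q p - ln_q p'] dx, Eq. (6)."""
    return np.sum(p**q * (ln_q(p, q) - ln_q(pp, q))) * dx

x = np.linspace(-10.0, 10.0, 20001)
dx = x[1] - x[0]
gauss = lambda mu: np.exp(-(x - mu)**2 / 2.0) / np.sqrt(2.0 * np.pi)
p, pp, q = gauss(0.0), gauss(0.5), 1.2
print(K_q(p, pp, dx, q), K_q(pp, p, dx, q))      # both non-negative
print(K_q(p, pp, dx, q) + K_q(pp, p, dx, q))     # distance D(p || p'), Eq. (5)
```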
The purpose of the present paper is to investigate the stationary and dynamical properties of the Tsallis entropy and the generalized Fisher information matrix. We have adopted two types of probability distributions: (a) the distribution derived from the maximum-entropy method, and (b) that obtained from the Fokker-Planck equation (FPE) method for the N-unit coupled Langevin model subjected to additive and multiplicative noise, which is a typical nonextensive system [11]. By applying the mean-field approximation to the coupled Langevin model in the case (b), we have studied the effects of additive and multiplicative noise, external force, input signal, couplings and the number of constituent elements on the Tsallis entropy and generalized Fisher information matrix in the adopted model. By solving the FPE with the partial difference method, we have investigated the transient responses to changes in these quantities applied to the stationary state. Such calculations are worthwhile, although they have not been reported so far, as far as we are aware. The outline of the paper is as follows. In Sec. II, the Tsallis and generalized Fisher entropies are calculated with the probability distributions derived from the maximum-entropy method, and the Cramér-Rao inequality in nonextensive systems is discussed. In Sec. III, we describe the adopted coupled Langevin model, for which analytical expressions for the Tsallis entropy and generalized Fisher information entropy in some limiting cases are presented. In Sec. IV, numerical model calculations of the stationary and dynamical entropies of the coupled Langevin model are presented. In Sec. V, we discuss the correlations among the two entropies and fluctuations, as well as the entropy flux and entropy production. Sec. VI is devoted to our conclusion.
2 Maximum-entropy method

2.1 Probability distribution
We will discuss, in this section, the Tsallis entropy and generalized Fisher information matrix in nonextensive systems whose distribution function p(x) is obtained by the maximum-entropy method. The variational condition for the Tsallis entropy given by Eq. (1) is taken into account with the three constraints [21]

$$1 = \int p(x)\,dx, \qquad (7)$$

$$\mu = E_q[x] = \int P_q(x)\,x\,dx, \qquad (8)$$

$$\sigma^2 = E_q[(x-\mu)^2] = \int P_q(x)\,(x-\mu)^2\,dx, \qquad (9)$$

where $E_q[\cdot]$ expresses the average over the escort probability $P_q(x)$ given by

$$P_q(x) = \frac{p(x)^q}{c_q}, \qquad (10)$$

$$c_q = \int p(x)^q\,dx, \qquad (11)$$

the entropic index q being assumed to satisfy 0 < q < 3. After some manipulations, we get the non-Gaussian (q-Gaussian) distribution [21]

$$p(x) = \frac{1}{Z_q}\exp_q\left[-\frac{(x-\mu)^2}{2\nu\sigma^2}\right], \qquad (12)$$

with

$$\nu = \frac{3-q}{2}, \qquad (13)$$

$$Z_q = \int \exp_q\left[-\frac{(x-\mu)^2}{2\nu\sigma^2}\right]dx \qquad (14)$$

$$= \left(\frac{2\nu\sigma^2}{q-1}\right)^{1/2} B\left(\frac{1}{2},\,\frac{1}{q-1}-\frac{1}{2}\right), \quad \text{for } 1 < q < 3 \qquad (15)$$

$$= \sqrt{2\pi}\,\sigma, \quad \text{for } q = 1 \qquad (16)$$

$$= \left(\frac{2\nu\sigma^2}{1-q}\right)^{1/2} B\left(\frac{1}{2},\,\frac{1}{1-q}+1\right). \quad \text{for } 0 < q < 1 \qquad (17)$$
Here B(a, b) denotes the beta function, and exp_q(x) stands for the q-exponential function defined by $\exp_q(x) \equiv [1 + (1-q)x]_+^{1/(1-q)}$, where $[y]_+ = y$ for $y \geq 0$ and 0 for $y < 0$. In the limit of q → 1, exp_q(x) reduces to the normal exponential function e^x, and the probability distribution p(x) in Eq. (12) becomes the Gaussian distribution:

$$p(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\,e^{-(x-\mu)^2/2\sigma^2}. \qquad (18)$$
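As a consistency check (a sketch under the 1 < q < 3 branch; the parameter values below are our own assumptions), the closed form of Z_q in Eq. (15) may be compared against a direct numerical integration of Eq. (14):

```python
# Numerical check of Eq. (15) against Eq. (14) for an assumed q = 1.5.
import numpy as np
from scipy.special import beta as B

q, mu, sigma2 = 1.5, 0.0, 1.0
nu = (3.0 - q) / 2.0                                   # Eq. (13)

def exp_q(y, q):
    """q-exponential: [1 + (1 - q) y]_+ ** (1 / (1 - q))."""
    return np.maximum(1.0 + (1.0 - q) * y, 0.0) ** (1.0 / (1.0 - q))

x = np.linspace(-60.0, 60.0, 400001)
dx = x[1] - x[0]
Zq_num = np.sum(exp_q(-(x - mu)**2 / (2.0 * nu * sigma2), q)) * dx            # Eq. (14)
Zq_cf = np.sqrt(2.0 * nu * sigma2 / (q - 1.0)) * B(0.5, 1.0 / (q - 1.0) - 0.5)  # Eq. (15)
print(Zq_num, Zq_cf)   # both ~ sqrt(3) * pi / 2 ~ 2.7207
```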
2.2 Tsallis entropy
With the use of Eqs. (1) and (12), the Tsallis entropy is given by

$$S_q = \frac{1}{2}\left[1 + \ln(2\pi\sigma^2)\right], \quad \text{for } q = 1 \qquad (19)$$

$$= \left(\frac{1-c_q}{q-1}\right), \quad \text{for } q \neq 1 \qquad (20)$$

with

$$c_q = \frac{1}{Z_q^q}\left(\frac{2\nu\sigma^2}{q-1}\right)^{1/2} B\left(\frac{1}{2},\,\frac{q}{q-1}-\frac{1}{2}\right), \quad \text{for } 1 < q < 3 \qquad (21)$$

$$= \frac{1}{Z_q^q}\left(\frac{2\nu\sigma^2}{1-q}\right)^{1/2} B\left(\frac{1}{2},\,\frac{q}{1-q}+1\right), \quad \text{for } 0 < q < 1 \qquad (22)$$

which yield

$$c_q = \nu\,Z_q^{1-q}. \quad \text{for } 0 < q < 3 \qquad (23)$$

Here Z_q for 1 < q < 3 and 0 < q < 1 is given by Eqs. (15) and (17), respectively.
2.3 Generalized Fisher information matrix
The distribution p(x) given by Eq. (12) is assumed to be characterized by the two parameters (θ₁, θ₂) = (µ, σ²). By using Eqs. (4) and (12), we obtain the components of the generalized Fisher information matrix G [13]-[18]:

$$g_{ij} = q\,E\left[\left(\frac{\partial \ln p(x)}{\partial\theta_i}\right)\left(\frac{\partial \ln p(x)}{\partial\theta_j}\right)\right] \qquad (24)$$

$$= q\,E\left[(X_i - E[X_i])(X_j - E[X_j])\right], \quad \text{for } i, j = 1, 2 \qquad (25)$$

with

$$X_i = \frac{\partial}{\partial\theta_i}\,\ln\exp_q\left[-\frac{(x-\mu)^2}{2\nu\sigma^2}\right], \qquad (26)$$
where E[·] denotes the average over the q-Gaussian distribution p(x), whereas E_q[·] stands for the average over the escort distribution P_q(x). Substituting the probability distribution given by Eq. (12) into Eq. (24), we get

$$g_{11} = q\int p(x)\left(\frac{\partial \ln p(x)}{\partial\mu}\right)^2 dx = q\int p(x)\left(\frac{\partial \ln p(x)}{\partial x}\right)^2 dx \qquad (27)$$

$$= \left(\frac{2q}{\nu\sigma^2(q-1)}\right)\frac{B\left(\frac{3}{2},\,\frac{1}{q-1}+\frac{1}{2}\right)}{B\left(\frac{1}{2},\,\frac{1}{q-1}-\frac{1}{2}\right)}, \quad \text{for } 1 < q < 3 \qquad (28)$$

$$= \frac{1}{\sigma^2}, \quad \text{for } q = 1 \qquad (29)$$

$$= \left(\frac{2q}{\nu\sigma^2(1-q)}\right)\frac{B\left(\frac{3}{2},\,\frac{1}{1-q}-1\right)}{B\left(\frac{1}{2},\,\frac{1}{1-q}+1\right)}, \quad \text{for } 0 < q < 1 \qquad (30)$$

which yield

$$g_{11} = \frac{1}{\sigma^2}. \quad \text{for } 0 < q < 3 \qquad (31)$$

A similar calculation leads to the (2,2) component:

$$g_{22} = q\int p(x)\left(\frac{\partial \ln p(x)}{\partial\sigma^2}\right)^2 dx \qquad (32)$$

$$= \frac{3-q}{4\sigma^4}. \quad \text{for } 0 < q < 3 \qquad (33)$$
The generalized Fisher information matrix is expressed by

$$G = \begin{pmatrix} \frac{1}{\sigma^2} & 0 \\ 0 & \frac{(3-q)}{4\sigma^4} \end{pmatrix},$$

whose inverse is given by

$$G^{-1} = \begin{pmatrix} \sigma^2 & 0 \\ 0 & \frac{4\sigma^4}{(3-q)} \end{pmatrix}.$$

In the limit of q = 1, the matrix reduces to

$$G = \begin{pmatrix} \frac{1}{\sigma^2} & 0 \\ 0 & \frac{1}{2\sigma^4} \end{pmatrix}. \quad \text{for } q = 1$$

2.4 Cramér-Rao inequality
Next we discuss the Cramér-Rao inequality in nonextensive systems. For the escort distribution given by Eq. (10), which satisfies Eqs. (8) and (9) together with

$$1 = E_q[1] = \int P_q(x)\,dx, \qquad (34)$$

we get the Cramér-Rao inequality [1, 14, 15]

$$V \geq \tilde{G}^{-1}. \qquad (35)$$
Here V denotes the covariance error matrix, whose explicit expression will be given shortly, and G̃ is referred to as the modified Fisher information matrix, whose components are expressed by

$$\tilde{g}_{ij} = E_q\left[\left(\frac{\partial \ln P_q(x)}{\partial\theta_i}\right)\left(\frac{\partial \ln P_q(x)}{\partial\theta_j}\right)\right], \quad \text{for } i, j = 1, 2 \qquad (36)$$

$$= E_q\left[(\tilde{X}_i - E_q[\tilde{X}_i])(\tilde{X}_j - E_q[\tilde{X}_j])\right], \qquad (37)$$

with

$$\tilde{X}_i = \frac{\partial}{\partial\theta_i}\,[q\ln p(x)] \qquad (38)$$

$$= q\,(X_i - E[X_i]), \qquad (39)$$

X_i being given by Eq. (26). Note that g̃_ij is different from g_ij given by Eq. (24) except for q = 1.0. The (1,1) component of G̃ is given by
$$\tilde{g}_{11} = E_q\left[\left(\frac{\partial \ln P_q(x)}{\partial\mu}\right)^2\right] \qquad (40)$$

$$= \left(\frac{q^2}{c_q}\right)\int p(x)^q\left(\frac{\partial \ln p(x)}{\partial x}\right)^2 dx \qquad (41)$$

$$= \left(\frac{2q^2}{\nu\sigma^2(q-1)}\right)\frac{B\left(\frac{3}{2},\,\frac{q}{q-1}+\frac{1}{2}\right)}{B\left(\frac{1}{2},\,\frac{q}{q-1}-\frac{1}{2}\right)}, \quad \text{for } 1 < q < 3 \qquad (42)$$

$$= \frac{1}{\sigma^2}, \quad \text{for } q = 1 \qquad (43)$$

$$= \left(\frac{2q^2}{\nu\sigma^2(1-q)}\right)\frac{B\left(\frac{3}{2},\,\frac{q}{1-q}-1\right)}{B\left(\frac{1}{2},\,\frac{q}{1-q}+1\right)}, \quad \text{for } 1/2 < q < 1 \qquad (44)$$

which lead to

$$\tilde{g}_{11} = \frac{q(q+1)}{(3-q)(2q-1)\sigma^2}. \quad \text{for } 1/2 < q < 3 \qquad (45)$$

Similarly, the (2,2) component of G̃ is given by

$$\tilde{g}_{22} = E_q\left[\left(\frac{\partial \ln P_q(x)}{\partial\sigma^2}\right)^2\right] \qquad (46)$$

$$= \frac{(q+1)}{4(2q-1)\sigma^4}. \quad \text{for } 1/2 < q < 3 \qquad (47)$$
The modified Fisher information matrix G̃ is expressed by

$$\tilde{G} = \begin{pmatrix} \frac{q(q+1)}{(3-q)(2q-1)\sigma^2} & 0 \\ 0 & \frac{(q+1)}{4(2q-1)\sigma^4} \end{pmatrix},$$

whose inverse is given by

$$\tilde{G}^{-1} = \begin{pmatrix} \frac{(3-q)(2q-1)\sigma^2}{q(q+1)} & 0 \\ 0 & \frac{4(2q-1)\sigma^4}{(q+1)} \end{pmatrix}.$$

A calculation of the (i, j) components v_ij of the covariance error matrix V leads to

$$V = \begin{pmatrix} \sigma^2 & 0 \\ 0 & \frac{4\sigma^4}{(5-3q)} \end{pmatrix}.$$

In the limit of q = 1, the matrices reduce to

$$\tilde{G}^{-1} = G^{-1} = \begin{pmatrix} \sigma^2 & 0 \\ 0 & 2\sigma^4 \end{pmatrix}, \quad \text{for } q = 1$$

$$V = \begin{pmatrix} \sigma^2 & 0 \\ 0 & 2\sigma^4 \end{pmatrix}. \quad \text{for } q = 1$$
Chain and solid curves in Fig. 1(a) express the q dependences of v11/σ² and 1/g̃11σ², respectively. As q departs from unity, 1/g̃11 decreases considerably, vanishing at q = 1/2 and q = 3. The lower bound of v11 in the Cramér-Rao relation is provided by g̃11:

$$v_{11} = \frac{1}{g_{11}} \geq \frac{1}{\tilde{g}_{11}}. \quad \text{for } 1/2 < q < 3 \qquad (48)$$

Chain, dashed and solid curves in Fig. 1(b) show v22/σ⁴, 1/g22σ⁴ and 1/g̃22σ⁴, respectively. It is noted that v22 diverges at q = 5/3. The following relations hold:

$$\frac{1}{g_{22}} > v_{22} > \frac{1}{\tilde{g}_{22}}, \quad \text{for } 1/2 < q < 1 \qquad (49)$$

$$v_{22} \geq \frac{1}{\tilde{g}_{22}} \geq \frac{1}{g_{22}}. \quad \text{for } 1 \leq q < 5/3 \qquad (50)$$

Equation (49) means that 1/g22 cannot provide the lower bound of v22. Equations (48)-(50) clearly show that the lower bound of V is given by the modified Fisher information matrix G̃ [Eq. (35)], and not by the generalized Fisher information matrix G.
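The inequality (48) can also be verified numerically; the sketch below (our own check, with an assumed q = 1.4 and σ² = 1) computes g̃11 of Eqs. (40)-(41) by direct integration and compares it with the closed form (45):

```python
# Numerical check of Eqs. (41), (45) and (48) for the q-Gaussian of Eq. (12).
import numpy as np

q, sigma2 = 1.4, 1.0                       # assumed values, 1/2 < q < 3
nu = (3.0 - q) / 2.0
x = np.linspace(-200.0, 200.0, 2000001)
dx = x[1] - x[0]
p = (1.0 + (q - 1.0) * x**2 / (2.0 * nu * sigma2)) ** (-1.0 / (q - 1.0))
p /= np.sum(p) * dx                        # normalized q-Gaussian, Eq. (12), mu = 0

cq = np.sum(p**q) * dx                     # Eq. (11)
s = np.gradient(np.log(p), x)              # d ln p / dx
g11t = (q**2 / cq) * np.sum(p**q * s**2) * dx   # Eqs. (40)-(41)
v11 = np.sum((p**q / cq) * x**2) * dx           # escort variance, Eq. (9)

print(g11t, q * (q + 1.0) / ((3.0 - q) * (2.0 * q - 1.0) * sigma2))  # Eq. (45)
print(v11, 1.0 / g11t)                     # Cramer-Rao: v11 >= 1/g11t, Eq. (48)
```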
3 Coupled Langevin model

3.1 Adopted model
In order to discuss both the stationary and dynamical properties of the Tsallis entropy and generalized Fisher information matrix, we have adopted the N-unit coupled Langevin model subjected to additive and multiplicative white noise, given by

$$\frac{dx_i}{dt} = F(x_i) + \beta\,\xi_i(t) + \alpha\,G(x_i)\,\eta_i(t) + I_i(t), \quad (i = 1 \text{ to } N) \qquad (51)$$

with

$$I_i(t) = \frac{J}{(N-1)}\sum_{j(\neq i)}\left[x_j(t) - x_i(t)\right] + I(t). \qquad (52)$$

Here F(x) and G(x) denote arbitrary functions of x, J the coupling strength, and I(t) an external input; α and β are the strengths of multiplicative and additive noise, respectively, and η_i(t) and ξ_i(t) express zero-mean Gaussian white noises with the correlations

$$\langle \eta_i(t)\,\eta_j(t')\rangle = \delta_{ij}\,\delta(t-t'), \qquad (53)$$

$$\langle \xi_i(t)\,\xi_j(t')\rangle = \delta_{ij}\,\delta(t-t'), \qquad (54)$$

$$\langle \eta_i(t)\,\xi_j(t')\rangle = 0. \qquad (55)$$
We have adopted the mean-field approximation for I_i(t):

$$I_i(t) \simeq \hat{J}\left[\mu(t) - x_i(t)\right] + I(t), \qquad (56)$$

with

$$\hat{J} = \frac{JN}{(N-1)}, \qquad (57)$$

$$\mu(t) = \frac{1}{N}\sum_i E_q[x_i(t)], \qquad (58)$$

where E_q[·] expresses the average over the escort distribution to be shown below [Eqs. (61)-(63)].
3.2 Fokker-Planck equation
Owing to the adopted mean-field approximation given by Eq. (56), each element of the ensemble is ostensibly independent. The total probability distribution P({x_k}, t) is given by the product of the distributions of the individual elements:

$$P(\{x_k\}, t) = \Pi_i\, p(x_i, t), \qquad (59)$$

where the FPE for p(x_i, t) in the Stratonovich representation is given by

$$\frac{\partial}{\partial t}p(x_i,t) = -\frac{\partial}{\partial x_i}\left\{\left[F(x_i) + I_i(t)\right]p(x_i,t)\right\} + \frac{\beta^2}{2}\frac{\partial^2}{\partial x_i^2}p(x_i,t) + \frac{\alpha^2}{2}\frac{\partial}{\partial x_i}\left[G(x_i)\frac{\partial}{\partial x_i}G(x_i)\,p(x_i,t)\right]. \qquad (60)$$
The expectation value µ(t) is determined in a self-consistent way by

$$\mu(t) = E_q[x_i(t)] \equiv \int P_q(x_i,t)\,x_i(t)\,dx_i, \qquad (61)$$

with the escort probability distribution

$$P_q(x_i,t) = \frac{1}{c_q(t)}\,p(x_i,t)^q, \qquad (62)$$

$$c_q(t) = \int p(x_i,t)^q\,dx_i, \qquad (63)$$

where p(x_i, t) depends on µ(t) through I_i(t) in Eqs. (56) and (60). The relevant fluctuation (variance) σ(t)² is given by

$$\sigma(t)^2 = E_q[(x_i-\mu)^2] = \int P_q(x_i,t)\,(x_i-\mu)^2\,dx_i. \qquad (64)$$
When we adopt F(x) and G(x) given by

$$F(x) = -\lambda\,x, \qquad (65)$$

$$G(x) = x, \qquad (66)$$

where λ denotes the relaxation rate, the FPE for p(x, t) is expressed by (the subscript i is hereafter omitted)

$$\frac{\partial}{\partial t}p(x,t) = \left(\lambda + \hat{J} + \frac{\alpha^2}{2}\right)p(x,t) + \left[\left(\lambda + \hat{J} + \frac{3\alpha^2}{2}\right)x - u(t)\right]\frac{\partial}{\partial x}p(x,t) + \left(\frac{\alpha^2}{2}x^2 + \frac{\beta^2}{2}\right)\frac{\partial^2}{\partial x^2}p(x,t), \qquad (67)$$

with

$$u(t) = \hat{J}\mu(t) + I(t). \qquad (68)$$
From the FPE given by Eq. (67), the stationary distribution is given by

$$\ln p(x) \propto -\left(\frac{2\lambda + 2\hat{J} + \alpha^2}{2\alpha^2}\right)\ln(\alpha^2 x^2 + \beta^2) + Y(x) \qquad (69)$$

$$\propto -\left(\frac{1}{q-1}\right)\ln\left[1 + (q-1)\left(\frac{x^2}{2\phi^2}\right)\right] + Y(x), \qquad (70)$$

with

$$q = 1 + \frac{2\alpha^2}{(2\lambda + 2\hat{J} + \alpha^2)}, \qquad (71)$$

$$\phi^2 = \frac{\beta^2}{(2\lambda + 2\hat{J} + \alpha^2)}, \qquad (72)$$

$$Y(x) = \left(\frac{2u}{\alpha\beta}\right)\tan^{-1}\left(\frac{\alpha x}{\beta}\right), \qquad (73)$$

$$u = \hat{J}\mu + I, \qquad (74)$$

where the entropic index lies in the range 1 ≤ q < 3. Equation (70) yields the q-Gaussian distribution

$$p(x) = \frac{1}{Z_q}\exp_q\left(-\frac{x^2}{2\phi^2}\right)e^{Y(x)}, \qquad (75)$$

with

$$Z_q = \int \exp_q\left(-\frac{x^2}{2\phi^2}\right)e^{Y(x)}\,dx. \qquad (76)$$
The probability distribution given by Eq. (75) is different from that of Eq. (12) derived from the maximum-entropy method, although both expressions are equivalent for µ = I = J = 0 with νσ² = φ². Some limiting cases of Eq. (75) are examined in the following.

(1) For α = 0 and β ≠ 0 (i.e. additive noise only),

$$p(x) = \frac{1}{Z_1}\,e^{-(1/2\sigma^2)(x-\mu)^2}, \qquad (77)$$

$$Z_1 = \sqrt{2\pi}\,\sigma, \qquad (78)$$

where Eqs. (61) and (64) yield

$$\mu = \frac{2\phi^2 u}{\beta^2} = \frac{I}{\lambda}, \quad \sigma^2 = \phi^2 = \frac{\beta^2}{2(\lambda + \hat{J})}. \qquad (79)$$

(2) For α ≠ 0, β = 0 and u ≠ 0 (i.e. multiplicative noise only) [22, 23, 24],

$$p(x) \propto |x|^{-\delta}\,e^{-(\kappa/x)}, \qquad (80)$$

with

$$\delta = \frac{2}{q-1} = \frac{(2\lambda + 2\hat{J} + \alpha^2)}{\alpha^2}, \qquad (81)$$

$$\kappa = \frac{2u}{\alpha^2} = \frac{2(\hat{J}\mu + I)}{\alpha^2}, \qquad (82)$$

though the normalization factor Z_q diverges.

(3) For I = J = 0 (i.e. without couplings and external input) [22, 23, 24],

$$p(x) = \frac{1}{Z_q}\exp_q\left(-\frac{x^2}{2\phi^2}\right), \qquad (83)$$

$$Z_q = \left(\frac{2\phi^2}{q-1}\right)^{1/2} B\left(\frac{1}{2},\,\frac{1}{q-1}-\frac{1}{2}\right). \qquad (84)$$

Equation (83) leads to

$$\mu = 0, \quad \sigma^2 = \frac{2\phi^2}{(3-q)} = \frac{\beta^2}{2\lambda}. \qquad (85)$$

It is noted that when we adopt the q-Gaussian probability p(x) in place of the escort probability P_q(x) in Eq. (64), the variance is given by σ² = β²/2(λ − α²), which diverges at λ = α² [24].
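As a numerical check of Eq. (85) (a sketch with assumed parameter values), the escort variance of the stationary q-Gaussian, with q and φ² taken from Eqs. (71) and (72), indeed reproduces β²/2λ:

```python
# Check of Eq. (85) for I = J = 0: escort variance equals beta^2 / (2 * lam).
import numpy as np

lam, alpha, beta = 1.0, 0.5, 0.5                     # assumed parameters
q = 1.0 + 2.0 * alpha**2 / (2.0 * lam + alpha**2)    # Eq. (71) with J = 0
phi2 = beta**2 / (2.0 * lam + alpha**2)              # Eq. (72) with J = 0

x = np.linspace(-200.0, 200.0, 2000001)
dx = x[1] - x[0]
p = (1.0 + (q - 1.0) * x**2 / (2.0 * phi2)) ** (-1.0 / (q - 1.0))   # Eq. (83), unnormalized
Pq = p**q / (np.sum(p**q) * dx)                      # escort distribution, Eq. (62)
sigma2 = np.sum(Pq * x**2) * dx                      # Eq. (64)
print(sigma2, beta**2 / (2.0 * lam))                 # both ~ 0.125
```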
3.3 Tsallis entropy
By using the stationary distributions given by Eqs. (77) and (83) in Eq. (1), we get the analytic expressions for the Tsallis entropy:

$$S_q = \frac{1}{2}\left[1 + \ln(2\pi\sigma^2)\right], \quad \text{for } \alpha = 0,\ \beta \neq 0 \qquad (86)$$

$$= \left(\frac{1-c_q}{q-1}\right), \quad \text{for } I = J = 0 \qquad (87)$$

with

$$c_q = \frac{1}{Z_q^q}\left(\frac{2\phi^2}{q-1}\right)^{1/2} B\left(\frac{1}{2},\,\frac{q}{q-1}-\frac{1}{2}\right) = \left(\frac{3-q}{2}\right)Z_q^{1-q}, \qquad (88)$$
where Z_q is given by Eq. (84).

With the use of the total distribution P({x_i}) given by Eq. (59), the entropy of the N-unit ensemble is given by

$$S_q^{(N)} = \left(\frac{1 - c_q^N}{q-1}\right). \qquad (89)$$
A simple calculation yields [25]

$$S_q^{(N)} = \sum_{k=1}^{N} C_k^N\,(-1)^{k-1}(q-1)^{k-1}\left(S_q^{(1)}\right)^k \qquad (90)$$

$$= N S_q^{(1)} - \frac{N(N-1)}{2}(q-1)\left(S_q^{(1)}\right)^2 + \cdots, \qquad (91)$$

where $C_k^N = N!/(N-k)!\,k!$ and S_q^{(1)} denotes the entropy of a single element (N = 1). Equation (91) shows that the Tsallis entropy is generally nonextensive except for q = 1.0, for which S_q^{(N)} reduces to the extensive Boltzmann-Gibbs-Shannon entropy: S_1^{(N)} = N S_1^{(1)}.
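A short numerical sketch (with an assumed single-element entropy and entropic index) illustrates the equivalence of Eqs. (89) and (90) and the resulting nonextensivity:

```python
# Eq. (90) summed directly versus Eq. (89) with c_q recovered from Eq. (87).
import numpy as np
from scipy.special import comb

def Sq_N(Sq1, q, N):
    """Eq. (90): sum over k of C(N, k) (-1)**(k-1) (q-1)**(k-1) Sq1**k."""
    k = np.arange(1, N + 1)
    return np.sum(comb(N, k) * (-1.0)**(k - 1) * (q - 1.0)**(k - 1) * Sq1**k)

q, Sq1, N = 1.05, 0.8, 10                  # assumed values
cq = 1.0 - (q - 1.0) * Sq1                 # inverted Eq. (87)
print(Sq_N(Sq1, q, N))                     # Eq. (90)
print((1.0 - cq**N) / (q - 1.0))           # Eq. (89): identical, ~6.70
print(N * Sq1)                             # extensive value 8.0, recovered as q -> 1
```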
3.4 Generalized Fisher information entropy
We consider the generalized Fisher information entropy given by [Eq. (27)]

$$g_q = q\int p(x)\left(\frac{\partial \ln p(x)}{\partial x}\right)^2 dx = -q\int p(x)\left(\frac{\partial^2 \ln p(x)}{\partial x^2}\right)dx. \qquad (92)$$

By using the stationary distributions given by Eqs. (77) and (83) in Eq. (4), we get the analytic expressions for g_q:

$$g_q = \frac{1}{\sigma^2} = \frac{2(\lambda+\hat{J})}{\beta^2}, \quad \text{for } \alpha = 0,\ \beta \neq 0 \qquad (93)$$

$$= \left(\frac{2q}{(q-1)\phi^2}\right)\frac{B\left(\frac{3}{2},\,\frac{1}{q-1}+\frac{1}{2}\right)}{B\left(\frac{1}{2},\,\frac{1}{q-1}-\frac{1}{2}\right)} = \frac{2\lambda}{\beta^2}. \quad \text{for } I = J = 0 \qquad (94)$$

It is easy to see that the generalized Fisher information entropy is extensive even in nonextensive systems [19], because Eqs. (4), (59) and (92) yield

$$g_q^{(N)} = N g_q^{(1)}, \qquad (95)$$

where g_q^{(N)} is the generalized Fisher information entropy for an N-unit system.
4 Model calculations

4.1 Stationary properties
The adopted Langevin model includes six parameters: λ, α, β, J, I and N. Dependences of the Tsallis entropy and generalized Fisher information entropy on these parameters will be studied by numerical methods in this section. We have calculated the distribution p(x) with the FPE [Eqs. (75) and (76)], and have also performed direct simulations (DSs) of the Langevin model [Eqs. (51) and (52)] with the Heun method; the DS results are averages over 100 trials. A minimal sketch of such a simulation is given below.
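The following is a rough illustration (not the authors' production code) of such a direct simulation: Eqs. (51) and (52) integrated with the stochastic Heun (predictor-corrector) scheme, which converges to the Stratonovich interpretation assumed in Eq. (60); the parameter values follow Fig. 2.

```python
# Direct simulation of Eqs. (51)-(52) with the Heun scheme (Stratonovich).
import numpy as np

rng = np.random.default_rng(0)
N, lam, alpha, beta, J, I = 100, 1.0, 0.5, 0.5, 0.5, 0.0
dt, nsteps = 1e-3, 20000

def drift(x):
    # F(x) + I_i(t): F(x) = -lam * x plus the diffusive coupling of Eq. (52)
    return -lam * x + (J / (N - 1)) * (x.sum() - N * x) + I

x = np.zeros(N)
for _ in range(nsteps):
    dxi = np.sqrt(dt) * rng.standard_normal(N)    # additive noise increment
    deta = np.sqrt(dt) * rng.standard_normal(N)   # multiplicative noise increment
    xp = x + drift(x) * dt + beta * dxi + alpha * x * deta      # predictor
    x = (x + 0.5 * (drift(x) + drift(xp)) * dt
         + beta * dxi + 0.5 * alpha * (x + xp) * deta)          # corrector
print(x.mean(), x.var())   # sample statistics of one trial (cf. Fig. 2)
```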
Figures 2(a)-2(c) show three examples of the stationary distribution p(x) for (I, J) = (0.0, 0.0), (0.0, 0.5) and (0.5, 0.5) with λ = 1.0, α = 0.5, β = 0.5 and N = 100. Solid curves show the results calculated with the use of the FPE, whereas dashed curves show those of DSs for the Langevin equation; both results are in good agreement and almost indistinguishable. When the coupling strength is increased from J = 0.0 to J = 0.5 with I = 0.0, the width of p(x) is decreased because of a decreased φ² in Eq. (72). When an input of I = 0.5 is applied, p(x) changes its position by an amount of about 0.5 with a slight variation of its shape: p(x) for (I, J) = (0.5, 0.5) is not a simple translational shift of p(x) for (I, J) = (0.0, 0.5).

4.1.1 α dependence
First we show the α dependences of µ, σ², Sq and gq, which are plotted in Figs. 3(a), 3(b), 3(c) and 3(d), respectively, for I = 0.0 (chain curves), I = 0.5 (dashed curves) and I = 1.0 (solid curves) with λ = 1.0, β = 0.5 and J = 0.0. Figure 3(a) shows that the α dependence of µ is very weak. We note in Fig. 3(b) that for I = 0.5 and I = 1.0, σ² is monotonically increased with increasing α, though σ² is independent of α for I = 0.0. Figure 3(c) shows that with increasing α, Sq is increased, with maxima at α ∼ 0.8 for I = 1.0 and at α ∼ 1.2 for I = 0.5. In contrast, with increasing α, gq is decreased, with minima at α ∼ 0.8 and α ∼ 0.9 for I = 1.0 and I = 0.5, respectively. For larger I, Sq and gq have stronger α dependences.

4.1.2 β dependence
Figures 4(a), 4(b), 4(c) and 4(d) show the β dependences of µ, σ², Sq and gq, respectively, for I = 0.0 (chain curves), I = 0.5 (dashed curves) and I = 1.0 (solid curves) with λ = 1.0, α = 0.5 and J = 0.0. With increasing β, µ does not change, although σ² is much increased for larger I. With increasing β, Sq is increased while gq is significantly decreased. This trend is more significant for I = 0.0 than for I = 0.5 and I = 1.0.

4.1.3 I dependence
The I dependences of µ, σ², Sq and gq are shown in Figs. 5(a)-5(d) for α = 0.0 (chain curves), α = 0.5 (dashed curves) and α = 1.0 (solid curves) with λ = 1.0. The gradient of µ versus I is slightly larger for larger α. In the case of α = 0.0, σ², Sq and gq are independent of I [see Eqs. (86) and (93)]. With increasing I for finite α, Sq is increased while gq is decreased.

4.1.4 J dependence
We show the J dependences of µ, σ², Sq and gq in Figs. 6(a)-6(d), for I = 0.0 (chain curves), I = 0.5 (dashed curves) and I = 1.0 (solid curves) with λ = 1.0, α = 0.5, β = 0.5 and N = 100. We note that µ is independent of J. With increasing J, σ² and Sq are linearly decreased whereas gq is increased.

4.1.5 N dependence
Figure 7 shows the Tsallis entropy per element, S_q^{(N)}/N, for α = 0.0 (dotted curve), α = 0.1 (solid curve), α = 0.5 (dashed curve) and α = 1.0 (chain curve) with λ = 1.0, β = 0.5, I = 0.0 and J = 0.0. Note that for α = 0.0 (q = 1.0), the system is extensive because S_1^{(N)}/N = S_1^{(1)}. For finite α, however, it is nonextensive: S_q^{(N)}/N is more significantly decreased for larger α [Eq. (91)], though the generalized Fisher information entropy gq is extensive [Eq. (95)].
4.2 Dynamical properties

4.2.1 Partial difference equation method
In order to discuss the dynamical properties of the entropies, we have to calculate the time-dependent probability p(x, t) by solving the FPE given by Eq. (67). In the case of q = 1.0, we may obtain the time-dependent distribution

$$p(x,t) = \frac{1}{\sqrt{2\pi\,\sigma(t)^2}}\,e^{-[x-\mu(t)]^2/2\sigma(t)^2}, \qquad (96)$$

where the equations of motion for µ(t) and σ(t)² for J = 0 are given by

$$\frac{d\mu(t)}{dt} = -\lambda\,\mu(t) + I, \qquad (97)$$

$$\frac{d\sigma(t)^2}{dt} = -2\lambda\,\sigma(t)^2 + \beta^2. \qquad (98)$$
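For q = 1.0 these moment equations can be integrated trivially; a minimal sketch (with assumed parameter values) shows the relaxation toward µ = I/λ and σ² = β²/2λ:

```python
# Euler integration of Eqs. (97)-(98); steady state: mu = I/lam, sigma^2 = beta^2/(2*lam).
lam, beta, I, dt = 1.0, 0.5, 1.0, 1e-3   # assumed values
mu, sig2 = 0.0, 0.0
for _ in range(10000):                   # integrate from t = 0 to t = 10
    mu += (-lam * mu + I) * dt                    # Eq. (97)
    sig2 += (-2.0 * lam * sig2 + beta**2) * dt    # Eq. (98)
print(mu, sig2)                          # -> 1.0 and 0.125
```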
In the case of q ≠ 1.0, we first tried to obtain an analytic solution of the time-dependent distribution p(x, t), assuming that it is given by [cf. Eq. (12)]

$$p(x,t) = \frac{A_q}{\sqrt{\sigma(t)^2}}\,\exp_q\left[-\frac{(x-\mu(t))^2}{2\nu\,\sigma(t)^2}\right], \qquad (99)$$

where the q-dependent coefficient A_q is obtainable from Eqs. (15)-(17). Although equations of motion for µ(t) and σ²(t) may be derived so as to satisfy the FPE after Ref. [26], they are not uniquely determined: the two equations for dµ(t)/dt and the three equations for dσ(t)²/dt are mutually inconsistent. This implies that the analytic solution of the FPE is not given by Eq. (99).
Since we could not succeed in obtaining an analytic solution for p(x, t), we have adopted a numerical method using the difference equation derived from Eq. (67):

$$p(x,t+b) = p(x,t) + \left(\lambda + \hat{J} + \frac{\alpha^2}{2}\right)b\,p(x,t) + \left[\left(\lambda + \hat{J} + \frac{3\alpha^2}{2}\right)x - u(t)\right]\left(\frac{b}{2a}\right)\left[p(x+a,t) - p(x-a,t)\right] + \left(\frac{\alpha^2}{2}x^2 + \frac{\beta^2}{2}\right)\left(\frac{b}{a^2}\right)\left[p(x+a,t) + p(x-a,t) - 2p(x,t)\right], \qquad (100)$$

with

$$u(t) = \hat{J}\,\mu(t) + I(t), \qquad (101)$$

where a and b denote the incremental steps of x and t, respectively. We impose the boundary condition

$$p(x,t) = 0, \quad \text{for } |x| \geq x_m \qquad (102)$$

with x_m = 5, and the initial condition p(x, 0) = p_0(x), where p_0(x) is the stationary distribution given by Eqs. (75) and (76). We have chosen the parameters a = 0.05 and b = 0.0001 so as to satisfy the condition (α²x_m²b/2a²) < 1/2, which is required for stable, convergent solutions of the partial difference equation. A minimal numerical sketch of this explicit scheme is given below.
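The sketch below implements the scheme of Eq. (100) for the simplest case I = J = 0, so that u(t) = 0 and the self-consistency loop for µ(t) of Eq. (61) can be skipped; it is an illustration under these assumptions, not the authors' code:

```python
# Explicit scheme of Eq. (100) with the grid and stability choices stated above.
import numpy as np

lam, alpha, beta, Jhat, u = 1.0, 0.5, 0.5, 0.0, 0.0   # I = J = 0
a, b, xm = 0.05, 0.0001, 5.0
x = np.arange(-xm, xm + a, a)
assert alpha**2 * xm**2 * b / (2.0 * a**2) < 0.5      # stability condition

# initial condition p_0(x): stationary q-Gaussian of Eqs. (75)-(76), Y(x) = 0
q = 1.0 + 2.0 * alpha**2 / (2.0 * lam + 2.0 * Jhat + alpha**2)   # Eq. (71)
phi2 = beta**2 / (2.0 * lam + 2.0 * Jhat + alpha**2)             # Eq. (72)
p = (1.0 + (q - 1.0) * x**2 / (2.0 * phi2)) ** (-1.0 / (q - 1.0))
p /= np.sum(p) * a

for _ in range(10000):                                # evolve up to t = 1
    dp = np.zeros_like(p)
    dp[1:-1] = (p[2:] - p[:-2]) / (2.0 * a)           # (p(x+a) - p(x-a)) / 2a
    d2p = np.zeros_like(p)
    d2p[1:-1] = (p[2:] + p[:-2] - 2.0 * p[1:-1]) / a**2
    p = p + b * ((lam + Jhat + 0.5 * alpha**2) * p
                 + ((lam + Jhat + 1.5 * alpha**2) * x - u) * dp
                 + 0.5 * (alpha**2 * x**2 + beta**2) * d2p)
    p[0] = p[-1] = 0.0                                # boundary condition, Eq. (102)
print(np.sum(p) * a)   # normalization stays ~1 for the stationary initial state
```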
4.2.2 Response to I(t)

We apply the pulse input given by

$$I(t) = \Delta I\;\Theta(t-2)\,\Theta(6-t), \qquad (103)$$

where ∆I = 1.0 and Θ(t) denotes the Heaviside function: Θ(t) = 1 for t > 0 and zero otherwise. Figure 8 shows the time-dependent distribution at various t for λ = 1.0, α = 0.5, β = 0.5 and J = 0.0. When the input ∆I is added at t = 2.0, the distribution is gradually changed, moving rightward. This change in p(x, t) induces changes in µ(t), σ(t)², Sq(t) and gq(t), whose time dependences are shown in Figs. 9(a) and 9(b). By the applied pulse input, µ, σ² and Sq are increased while gq is decreased. For comparison, we show by dashed curves the results when the step input given by

$$I(t) = \Delta I\;\Theta(t-2) \qquad (104)$$

is applied. The relaxation time of Sq and gq is about 2.0.
4.2.3 Response to λ(t)

We modify the relaxation rate as

$$\lambda = 1.0 + 0.5\;\Theta(t-2)\,\Theta(6-t), \qquad (105)$$

which expresses an application of an external force of −∆λ x for 2 ≤ t < 6. Figures 10(a) and 10(b) show the time dependences of σ², Sq and gq with α = 0.5, β = 0.5, I = 0.0 and J = 0.0, for which µ = 0. When the external force is applied, σ² and Sq are decreased whereas gq is increased. The relaxation times of Sq and gq are 0.47 and 0.53, respectively.

4.2.4 Response to α(t)
The magnitude of multiplicative noise is changed as

$$\alpha(t)^2 = 0.25\,[1.0 + \Theta(t-2)\,\Theta(6-t)]. \qquad (106)$$

Figures 11(a) and 11(b) show the time dependences of σ², Sq and gq with λ = 1.0, β = 0.5, I = 0.0 and J = 0.0. When α is increased, Sq and σ² are increased whereas gq is decreased. The relaxation times of Sq and gq are 0.83 and 0.92, respectively.

4.2.5 Response to β(t)
We change the magnitude of additive noise as

$$\beta(t)^2 = 0.25\,[1.0 + \Theta(t-2)\,\Theta(6-t)]. \qquad (107)$$

Figures 12(a) and 12(b) show the time dependences of σ², Sq and gq with λ = 1.0, α = 0.5, I = 0.0 and J = 0.0. When β is increased, σ² and Sq are increased whereas gq is decreased. The relaxation times of Sq and gq are 0.49 and 0.34, respectively.
5 Discussion

5.1 Entropy flux and entropy production
It is interesting to discuss the entropy flux and entropy production from the time derivative of the Tsallis entropy:

$$\frac{dS_q(t)}{dt} = -\left(\frac{q}{q-1}\right)\int p(x,t)^{q-1}\left(\frac{\partial p(x,t)}{\partial t}\right)dx \qquad (108)$$

$$= Q_F + Q_A + Q_M, \qquad (109)$$
with

$$Q_F = q\int p(x,t)^q\left(\frac{dF(x)}{dx}\right)dx + q(q-1)\int p(x,t)^q\left(\frac{\partial \ln p(x,t)}{\partial x}\right)F(x)\,dx, \qquad (110)$$

$$Q_A = \frac{\beta^2 q}{2}\int p(x,t)^q\left(\frac{\partial \ln p(x,t)}{\partial x}\right)^2 dx, \qquad (111)$$

$$Q_M = \frac{\alpha^2}{2}\int p(x,t)^q\left[q\left(\frac{\partial \ln p(x,t)}{\partial x}\right)^2 G(x)^2 - \left(\frac{dG(x)}{dx}\right)^2 - G(x)\,\frac{d^2 G(x)}{dx^2}\right]dx. \qquad (112)$$
Here Q_F denotes the entropy flux, and Q_A and Q_M stand for the entropy productions due to additive and multiplicative noise, respectively. By using the stationary distribution given by Eq. (83), we get Q_F, Q_A and Q_M in the stationary state with I = J = 0 (i.e. without couplings and external input):

$$Q_F = -\frac{\lambda q}{Z_q^q}\left(\frac{2\sigma^2}{q-1}\right)^{1/2} B\left(\frac{1}{2},\,\frac{1}{q-1}+\frac{1}{2}\right) + \frac{\lambda q(q-1)}{\sigma^2 Z_q^q}\left(\frac{2\sigma^2}{q-1}\right)^{3/2} B\left(\frac{3}{2},\,\frac{1}{q-1}+\frac{1}{2}\right), \qquad (113)$$

$$Q_A = \frac{\beta^2 q}{2\sigma^4 Z_q^q}\left(\frac{2\sigma^2}{q-1}\right)^{3/2} B\left(\frac{3}{2},\,\frac{1}{q-1}+\frac{1}{2}\right), \qquad (114)$$

$$Q_M = \frac{\alpha^2 q}{2\sigma^4 Z_q^q}\left(\frac{2\sigma^2}{q-1}\right)^{5/2} B\left(\frac{5}{2},\,\frac{1}{q-1}+\frac{1}{2}\right) - \frac{\alpha^2}{2 Z_q^q}\left(\frac{2\sigma^2}{q-1}\right)^{1/2} B\left(\frac{1}{2},\,\frac{1}{q-1}+\frac{1}{2}\right), \qquad (115)$$

where Z_q is given by Eq. (84). Equations (113)-(115) satisfy the stationary condition Q_F + Q_A + Q_M = 0.

It is worthwhile to examine the limit of α → 0 (q → 1.0), in which Eqs. (108) and (113)-(115) yield

$$\frac{dS_1(t)}{dt} = -\int \ln p(x,t)\left(\frac{\partial p(x,t)}{\partial t}\right)dx \qquad (116)$$

$$= Q_F + Q_A, \qquad (117)$$

with

$$Q_F = \int p(x,t)\left(\frac{dF(x)}{dx}\right)dx, \qquad (118)$$

$$Q_A = \left(\frac{\beta^2}{2}\right)\int p(x,t)\left(\frac{\partial \ln p(x,t)}{\partial x}\right)^2 dx. \qquad (119)$$

Noting the relation $\lim_{|z|\to\infty}[\Gamma(z+a)/\Gamma(z)z^a] = 1$ [27], we may see that Eqs. (118) and (119) lead to Q_F = −Q_A = −λ and dS_1/dt = 0 in the limit of q → 1. In the opposite limit of β → 0, Eqs. (113)-(115) yield that each of Q_F, Q_A and Q_M is proportional to 1/β^{(q−1)} and is thus divergent in this limit, though Q_F + Q_A + Q_M = 0 still holds. It is noted that Q_A = λ for α → 0 and β → 0 [2, 3, 4].
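The stationary balance can also be checked numerically; the sketch below (our own verification with assumed parameters, using Eqs. (110)-(112) with F(x) = −λx and G(x) = x) gives Q_F + Q_A + Q_M ≈ 0:

```python
# Numerical check of Q_F + Q_A + Q_M = 0 in the stationary state (I = J = 0).
import numpy as np

lam, alpha, beta = 1.0, 0.5, 0.5                       # assumed parameters
q = 1.0 + 2.0 * alpha**2 / (2.0 * lam + alpha**2)      # Eq. (71)
phi2 = beta**2 / (2.0 * lam + alpha**2)                # Eq. (72)

x = np.linspace(-200.0, 200.0, 2000001)
dx = x[1] - x[0]
p = (1.0 + (q - 1.0) * x**2 / (2.0 * phi2)) ** (-1.0 / (q - 1.0))
p /= np.sum(p) * dx                                    # stationary p(x), Eq. (83)
s = np.gradient(np.log(p), x)                          # d ln p / dx

QF = (-lam * q * np.sum(p**q) * dx
      - lam * q * (q - 1.0) * np.sum(p**q * s * x) * dx)        # Eq. (110), F = -lam*x
QA = 0.5 * beta**2 * q * np.sum(p**q * s**2) * dx               # Eq. (111)
QM = 0.5 * alpha**2 * np.sum(p**q * (q * s**2 * x**2 - 1.0)) * dx  # Eq. (112), G = x
print(QF, QA, QM, QF + QA + QM)                        # sum ~ 0
```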
We present some calculated results of QF , QA and QM in the stationary state, which
are shown in Fig. 13 as a function of α for β = 0.1 (dashed curves), β = 0.5 (chain curves) and β = 1.0 (solid curves). We note that QF < 0 and QA + QM > 0. With increasing α, QF is decreased in the case of β = 0.1, while it is increased in the cases of β = 0.5 and 1.0. Bag [4] showed that QF is always decreased with increasing α, which disagrees with our result mentioned above: Eqs. (36) and (37) in Ref. [4] are rather different from our Eqs. (113)-(115).
5.2 Correlation among Sq, gq and σ²
From Figs. 3-6, we expect that there might be some correlations among Sq, gq and σ². Indeed, Eqs. (86) and (93) show that Sq, gq and σ² for α = 0 (q = 1.0) are related by

$$S_q = c\,\ln\sigma^2 + d, \qquad (120)$$

$$g_q = \frac{1}{\sigma^2}, \quad \text{for } \alpha = 0 \qquad (121)$$

where c = 1/2 and d = (1/2)[1 + ln(2π)]. In order to study the correlations among Sq, gq and σ² for arbitrary α, we plot in Fig. 14(a) Sq and 1/gq as functions of σ² when α is varied, for I = 0.0 (chain curves), I = 0.5 (dashed curves) and I = 1.0 (solid curves) with β = 0.5 and J = 0.0, as discussed in Sec. 4.1.1 (see Fig. 3). Similarly, Fig. 14(b) shows Sq and 1/gq as functions of σ² when β is varied, for I = 0.0 (chain curves), I = 0.5 (dashed curves) and I = 1.0 (solid curves) with α = 0.5 and J = 0.0, as discussed in Sec. 4.1.2 (see Fig. 4). We note in Fig. 14(b) that Sq and gq nearly follow the dotted lines expressing the relations given by Eqs. (120) and (121). In contrast, Sq and 1/gq in Fig. 14(a) show peculiar σ² dependences, which arise from the fact that σ², Sq and gq have intricate α dependences, as shown in Fig. 3. Multiplicative noise significantly modifies the σ² dependences of Sq and 1/gq for I ≠ 0, and Sq and gq may not be represented by simple functions of σ². Figures 14(a) and 14(b) show that with increasing σ², Sq is increased while gq is decreased. This behavior is realized also in the time-dependent responses of σ(t)², Sq(t) and gq(t) shown in Figs. 9-12.
6 Conclusion
Properties of the Tsallis entropy (Sq) and generalized Fisher information entropy (gq) have been discussed with the use of two types of probability distributions: (a) the distribution derived by the maximum-entropy method, and (b) that obtained by the FPE for the coupled Langevin model subjected to additive and multiplicative noise, which is a typical nonextensive system. There are differences and similarities in the properties of Sq and gq for the two distributions:

(i) with increasing σ², Sq is increased while gq is decreased in both cases, and

(ii) Sq and gq are independent of µ in the case (a), while they depend on µ in the case (b).

We have demonstrated for the case (a) that

(iii) the Cramér-Rao inequality for q ≠ 1.0 is expressed in terms of the modified Fisher information matrix, not of the generalized Fisher information matrix.

A detailed study of the case (b) has shown the following:

(iv) with increasing J, σ² and Sq are decreased while gq is increased (Fig. 6),

(v) with increasing N, Sq per element is generally decreased (nonextensive) whereas gq is extensive (Fig. 7),

(vi) the σ² dependences of Sq and gq are significantly modified by multiplicative noise for I ≠ 0 (Fig. 14), and

(vii) the relaxation times of Sq and gq for transient changes in λ, α and β are short (τ ∼ 0.5-0.9), while those for changes in I are fairly long (τ ∼ 2).

The difference between the σ² dependences of Sq and gq in items (i) and (v) arises from the fact that Sq provides us with a global measure of ignorance, while gq provides a local measure of the positive amount of information [1]. The difference between the two cases in item (ii) is due to the difference in the probability distributions: p(x) of the case (a) [Eq. (12)] has translational symmetry with respect to µ, while that of the case (b) [Eq. (75)] does not. Items (iv)-(vii) clarify the interplay among λ, α, β, I, J and N for Sq and gq in the coupled Langevin model of the case (b). It would be interesting to apply the present approach to the neuronal-network model [28, 29], which is left for our future study.
Acknowledgments

This work is partly supported by a Grant-in-Aid for Scientific Research from the Japanese Ministry of Education, Culture, Sports, Science and Technology.
References

[1] B. R. Frieden, Physics from Fisher Information: A Unification (Cambridge University Press, Cambridge, 1998).
[2] D. Daems and G. Nicolis, Phys. Rev. E 59, 4000 (1999).
[3] B. C. Bag, S. K. Banik, and D. S. Ray, Phys. Rev. E 64, 026110 (2001).
[4] B. C. Bag, Phys. Rev. E 66, 026122 (2002).
[5] D. O. Gonzaler, M. Mayorga, J. Orozco, and L. R. Salazar, J. Chem. Phys. 118, 6989 (2003).
[6] B. Q. Ai, X. J. Wang, G. T. Liu, and L. G. Liu, Phys. Rev. E 67, 022903 (2003).
[7] G. Goswami, B. Mukherjee, and B. C. Bag, Chem. Phys. 312, 47 (2005).
[8] S. Amari and H. Nagaoka, Methods of Information Geometry (AMS and Oxford University Press, 2000).
[9] C. Tsallis, J. Stat. Phys. 52, 479 (1988).
[10] C. Tsallis, R. S. Mendes, and A. R. Plastino, Physica A 261, 534 (1998).
[11] C. Tsallis, Physica D 193, 3 (2004).
[12] Lists of many applications of the nonextensive statistics are available at http://tsallis.cat.cbpf.br/biblio.htm
[13] S. Abe, Phys. Rev. E 68, 031101 (2003).
[14] J. Naudts, J. Ineq. Pure Appl. Math. 5, 102 (2004).
[15] F. Pennini and A. Plastino, Physica A 334, 132 (2004).
[16] M. Portesi, A. Plastino, and F. Pennini, Physica A 365, 173 (2006).
[17] M. Portesi, F. Pennini, and A. Plastino, Physica A 373, 273 (2007).
[18] M. Masi, arXiv:cond-mat/0611300.
[19] Hiroshi Hasegawa, Prog. Theor. Phys. Suppl. 162, 183 (2006).
[20] A. Ohara, Phys. Lett. A 370, 184 (2007).
[21] H. Hasegawa, Physica A 365, 383 (2006).
[22] H. Sakaguchi, J. Phys. Soc. Jpn. 70, 3247 (2001).
[23] C. Anteneodo and C. Tsallis, J. Math. Phys. 44, 5194 (2003).
[24] H. Hasegawa, Physica A 374, 585 (2007).
[25] Eliminating c_q from $S_q^{(1)} = (1-c_q)/(q-1)$ and $S_q^{(N)} = (1-c_q^N)/(q-1)$, we get $(q-1)S_q^{(N)} = 1 - [1-(q-1)S_q^{(1)}]^N$, which leads to Eq. (90).
[26] A. R. Plastino, M. Casas, and A. Plastino, Physica A 280, 289 (2000).
[27] M. Abramowitz and I. A. Stegun, Handbook of Mathematical Functions (Dover, New York, 1972).
[28] H. Hasegawa, Phys. Rev. E 75, 051904 (2007).
[29] H. Hasegawa, in Neuronal Network Research Horizons, edited by M. L. Weiss (Nova Science Publishers, New York, 2007), p. 61.
Figure 1: The q dependences of (a) v11/σ² (= 1/g11σ²) (chain curve) and 1/g̃11σ² (solid curve), and (b) v22/σ⁴ (chain curve), 1/g̃22σ⁴ (solid curve) and 1/g22σ⁴ (dashed curve); g_ij and g̃_ij are elements of the generalized and modified Fisher information matrices, respectively.

Figure 2: (Color online) Stationary distribution p(x) for (I, J) = (0.0, 0.0), (0.0, 0.5) and (0.5, 0.5) with λ = 1.0, α = 0.5 and β = 0.5, calculated by the FPE [Eqs. (75) and (76)] (solid curves) and by direct simulations (DSs) of the coupled Langevin model [Eqs. (51) and (52)] (dashed curves).

Figure 3: The α dependences of (a) µ, (b) σ², (c) Sq and (d) gq for I = 0.0 (chain curves), I = 0.5 (dashed curves) and I = 1.0 (solid curves) with λ = 1.0, β = 0.5 and J = 0.0.

Figure 4: The β dependences of (a) µ, (b) σ², (c) Sq and (d) gq for I = 0.0 (chain curves), I = 0.5 (dashed curves) and I = 1.0 (solid curves) with λ = 1.0, α = 0.5 and J = 0.0.

Figure 5: The I dependences of (a) µ, (b) σ², (c) Sq and (d) gq for α = 0.0 (chain curves), α = 0.5 (dashed curves) and α = 1.0 (solid curves) with λ = 1.0, β = 0.5 and J = 0.0.

Figure 6: The J dependences of (a) µ, (b) σ², (c) Sq and (d) gq for I = 0.0 (chain curves), I = 0.5 (dashed curves) and I = 1.0 (solid curves) with λ = 1.0, α = 0.5, β = 0.5 and N = 100.

Figure 7: The N dependence of the Tsallis entropy per element, S_q^{(N)}/N, for α = 0.0 (dotted curve), α = 0.1 (solid curve), α = 0.5 (dashed curve) and α = 1.0 (chain curve) with λ = 1.0, β = 0.5, I = 0.0 and J = 0.0.

Figure 8: The time-dependent probability distribution p(x, t) when the input pulse I(t) = ∆I Θ(t − 2)Θ(6 − t) is applied, with ∆I = 1.0, λ = 1.0, α = 0.5 and β = 0.5; curves are consecutively shifted downward by 0.25 for clarity.

Figure 9: The time dependences of (a) µ(t) and σ(t)² and (b) Sq(t) and gq(t) for the inputs I(t) = ∆I Θ(t − 2)Θ(6 − t) (solid curves) and I(t) = ∆I Θ(t − 2) (dashed curves) shown at the bottom of (a): ∆I = 1.0, λ = 1.0, α = 0.5, β = 0.5 and J = 0.0; the results for gq and µ are divided by a factor of ten.

Figure 10: The time dependences of (a) σ(t)² and (b) Sq(t) and gq(t) for λ(t) = 1.0 + 0.5 Θ(t − 2)Θ(6 − t) shown at the bottom of (a): α = 0.5, β = 0.5, I = 0.0 and J = 0.0; the results for gq are divided by a factor of ten.
Figure 11: The time dependences of (a) σ(t)² and (b) Sq(t) and gq(t) for α(t)² = 0.25[1.0 + Θ(t − 2)Θ(6 − t)] shown at the bottom of (a): λ = 1.0, β = 0.5, I = 0.0 and J = 0.0; the results for gq are divided by a factor of ten.

Figure 12: The time dependences of (a) σ(t)² and (b) Sq(t) and gq(t) for β(t)² = 0.25[1.0 + Θ(t − 2)Θ(6 − t)] shown at the bottom of (a): λ = 1.0, α = 0.5, I = 0.0 and J = 0.0; the results for gq are divided by a factor of ten.

Figure 13: (Color online) The α dependences of the entropy flux (Q_F) and the entropy productions by additive noise (Q_A) and multiplicative noise (Q_M) for β = 0.1 (dashed curves), β = 0.5 (chain curves) and β = 1.0 (solid curves) with λ = 1.0, I = 0.0 and J = 0.0.

Figure 14: (Color online) Sq and 1/gq as functions of σ², (a) for varying α with λ = 1.0, β = 0.5 and J = 0.0, and (b) for varying β with λ = 1.0, α = 0.5 and J = 0.0: I = 0.0 (chain curves), I = 0.5 (dashed curves) and I = 1.0 (solid curves); dotted lines denote the relations given by Eqs. (120) and (121) (see text).
This figure "fig1.gif" is available in "gif" format from: http://arXiv.org/ps/0711.3923v2
This figure "fig2.gif" is available in "gif" format from: http://arXiv.org/ps/0711.3923v2
This figure "fig3.gif" is available in "gif" format from: http://arXiv.org/ps/0711.3923v2
This figure "fig4.gif" is available in "gif" format from: http://arXiv.org/ps/0711.3923v2
This figure "fig5.gif" is available in "gif" format from: http://arXiv.org/ps/0711.3923v2
This figure "fig6.gif" is available in "gif" format from: http://arXiv.org/ps/0711.3923v2
This figure "fig7.gif" is available in "gif" format from: http://arXiv.org/ps/0711.3923v2
This figure "fig8.gif" is available in "gif" format from: http://arXiv.org/ps/0711.3923v2
This figure "fig9.gif" is available in "gif" format from: http://arXiv.org/ps/0711.3923v2
This figure "fig10.gif" is available in "gif" format from: http://arXiv.org/ps/0711.3923v2
This figure "fig11.gif" is available in "gif" format from: http://arXiv.org/ps/0711.3923v2
This figure "fig12.gif" is available in "gif" format from: http://arXiv.org/ps/0711.3923v2
This figure "fig13.gif" is available in "gif" format from: http://arXiv.org/ps/0711.3923v2
This figure "fig14.gif" is available in "gif" format from: http://arXiv.org/ps/0711.3923v2