TY - CHAP

T1 - Gauge freedom of entropies on q-Gaussian measures

AU - Matsuzoe, Hiroshi

AU - Takatsu, Asuka

N1 - Funding Information:
Acknowledgements Both authors were supported in part by JSPS Grant-in-Aid for Scientific Research (KAKENHI) 16KT0132. HM was supported in part by KAKENHI 19K03489. AT was supported in part by KAKENHI 19K03494, 19H01786.
Publisher Copyright:
© Springer Nature Switzerland AG 2021.

PY - 2021

Y1 - 2021

N2 - A q-Gaussian measure is a generalization of a Gaussian measure, obtained by replacing the exponential function with the power function of exponent 1/(1-q) (q ≠ 1); the limit case q = 1 recovers a Gaussian measure. On the set of all q-Gaussian densities over the real line with 1 ≤ q < 3, escort expectations determine information geometric structures such as an entropy and a relative entropy. The ordinary expectation of a random variable is the integral of the random variable with respect to its law; escort expectations allow us to replace the law with other measures. One of the most important escort expectations on the set of all q-Gaussian densities is the q-escort expectation, since this escort expectation determines the Tsallis entropy and the Tsallis relative entropy. The phenomenon of gauge freedom of entropies is that different escort expectations determine the same entropy but different relative entropies. In this chapter, we first introduce a refinement of the q-logarithmic function. Then we demonstrate the phenomenon on an open set of all q-Gaussian densities over the real line by using the refined q-logarithmic functions. We write down the corresponding Riemannian metric.

AB - A q-Gaussian measure is a generalization of a Gaussian measure, obtained by replacing the exponential function with the power function of exponent 1/(1-q) (q ≠ 1); the limit case q = 1 recovers a Gaussian measure. On the set of all q-Gaussian densities over the real line with 1 ≤ q < 3, escort expectations determine information geometric structures such as an entropy and a relative entropy. The ordinary expectation of a random variable is the integral of the random variable with respect to its law; escort expectations allow us to replace the law with other measures. One of the most important escort expectations on the set of all q-Gaussian densities is the q-escort expectation, since this escort expectation determines the Tsallis entropy and the Tsallis relative entropy. The phenomenon of gauge freedom of entropies is that different escort expectations determine the same entropy but different relative entropies. In this chapter, we first introduce a refinement of the q-logarithmic function. Then we demonstrate the phenomenon on an open set of all q-Gaussian densities over the real line by using the refined q-logarithmic functions. We write down the corresponding Riemannian metric.

KW - Gauge freedom of entropies

KW - Information geometry

KW - Refined q-logarithmic function

KW - q-Gaussian measure

UR - http://www.scopus.com/inward/record.url?scp=85102766534&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85102766534&partnerID=8YFLogxK

U2 - 10.1007/978-3-030-65459-7_6

DO - 10.1007/978-3-030-65459-7_6

M3 - Chapter

AN - SCOPUS:85102766534

T3 - Signals and Communication Technology

SP - 127

EP - 152

BT - Signals and Communication Technology

PB - Springer Science and Business Media Deutschland GmbH

ER -