Hyperstability of a functional equation. (English) Zbl 1212.39044
The main result of the paper is the following. Let \(\alpha,\varepsilon \in\mathbb{R}\) be fixed with \(\alpha<0\) and \(\varepsilon\geq 0\). Then a function \(f:\,]0,1[\,\to\mathbb{R}\) satisfies the inequality \[ \left| f(x)+(1-x)^{\alpha}f\left(\frac{y}{1-x}\right)-f(y)-(1-y)^{\alpha} f\left(\frac{x}{1-y}\right)\right| \leq \varepsilon \tag{1} \] for all \((x,y)\in D\doteq \{(x,y)\in\mathbb{R}^{2}:x,y,x+y\in \,]0,1[\,\}\) if, and only if, there exist \(a,b \in\mathbb{R}\) such that \[ f(x)= ax^{\alpha}+b\left((1-x)^{\alpha}-1\right) \qquad (x\in \,]0,1[\,). \] This result has the somewhat surprising consequence that the parametric fundamental equation of information (which is obtained from \((1)\) by taking \(\varepsilon=0\), and which plays an important role in the characterization of information measures) is hyperstable, that is, the solutions of the stability inequality \((1)\) coincide with the solutions of the parametric fundamental equation of information. As a corollary, it is also proved that the system of equations defining the \(\alpha\)-recursive and semi-symmetric information measures is stable.
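The sufficiency part of this equivalence can be checked by direct substitution (a brief sketch, using only the notation above, not the paper's argument): if \(f(x)=ax^{\alpha}+b((1-x)^{\alpha}-1)\) on \(]0,1[\,\), then for every \((x,y)\in D\) \[ f(x)+(1-x)^{\alpha}f\left(\frac{y}{1-x}\right) = a x^{\alpha}+b\bigl((1-x)^{\alpha}-1\bigr) + a y^{\alpha}+b\bigl((1-x-y)^{\alpha}-(1-x)^{\alpha}\bigr) = a\bigl(x^{\alpha}+y^{\alpha}\bigr)+b\bigl((1-x-y)^{\alpha}-1\bigr), \] an expression symmetric in \(x\) and \(y\); hence both sides of the parametric fundamental equation of information (the case \(\varepsilon=0\) of \((1)\)) agree, so \((1)\) holds with any \(\varepsilon\geq 0\). The substantial content of the theorem is therefore the converse implication.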

MSC:
39B82 Stability, separation, extension, and related topics for functional equations
94A17 Measures of information, entropy
References:
[1] J. Aczél and Z. Daróczy, On Measures of Information and their Characterization, Academic Press (New York – San Francisco – London, 1975).
[2] Z. Daróczy, Generalized information functions, Information and Control, 16 (1970), 36–51. · Zbl 0205.46901 · doi:10.1016/S0019-9958(70)80040-7
[3] B. R. Ebanks, P. Sahoo and W. Sander, Characterizations of Information Measures, World Scientific Publishing Co., Inc. (River Edge, NJ, 1998). · Zbl 0923.94002
[4] G. L. Forti, Hyers-Ulam stability of functional equations in several variables, Aequationes Math., 50 (1995), 143–190. · Zbl 0836.39007 · doi:10.1007/BF01831117
[5] E. Gselmann, Recent results on the stability of the parametric fundamental equation of information, to appear in Acta Math. Acad. Paedagog. Nyházi. · Zbl 1224.39043
[6] E. Gselmann and Gy. Maksa, Stability of the parametric fundamental equation of information for nonpositive parameters, to appear in Aequationes Math.
[7] J. Havrda and F. Charvát, Quantification method of classification processes, concept of structural \(\alpha\)-entropy, Kybernetika, 3 (1967), 30–35. · Zbl 0178.22401
[8] Gy. Maksa, Solution on the open triangle of the generalized fundamental equation of information with four unknown functions, Utilitas Math., 21 (1982), 267–282. · Zbl 0497.94003
[9] Gy. Maksa, The stability of the entropy of degree alpha, J. Math. Anal. Appl., 346 (2008), 17–21. · Zbl 1149.39024 · doi:10.1016/j.jmaa.2008.05.034
[10] Gy. Maksa and Zs. Páles, Hyperstability of a class of linear functional equations, Acta Math. Acad. Paedagog. Nyházi., 17 (2001), 107–112.
[11] Z. Moszner, Sur les définitions différentes de la stabilité des équations fonctionnelles, Aequationes Math., 68 (2004), 260–274. · Zbl 1060.39031 · doi:10.1007/s00010-004-2749-3
[12] L. Székelyhidi, Problem 38 (in Report of Meeting), Aequationes Math., 41 (1991), 302.
[13] C. Tsallis, Possible generalization of Boltzmann-Gibbs statistics, J. of Statistical Physics, 52 (1988), 479–487. · Zbl 1082.82501 · doi:10.1007/BF01016429