Many information measures have been suggested in the literature. Among these are the Shannon H(θ) and the Awad A(θ) entropies. In this work we suggest a new entropy measure, B(θ), which is based on the maximum likelihood function. These three entropies were calculated for the gamma distribution and its normal approximation, the binomial and its Poisson approximation, and the Poisson and its normal approximation. The relative losses in these three entropies are used as a criterion for the appropriateness of the approximation. Copyright © 1996 Elsevier Science Ltd.
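As a rough illustration of the relative-loss criterion described above, the following Python sketch (my own illustration, not the paper's code) computes the Shannon entropy of a binomial distribution and of its Poisson approximation and reports the relative loss in entropy; the parameter values are arbitrary.

```python
import numpy as np
from scipy.stats import binom, poisson

def shannon_entropy(pmf):
    """Shannon entropy H = -sum p*log(p) over the support (0*log 0 := 0)."""
    p = pmf[pmf > 0]
    return -np.sum(p * np.log(p))

# Binomial(n, p) and its Poisson(n*p) approximation on a common support.
n, p = 50, 0.05
k = np.arange(0, n + 1)
H_binom = shannon_entropy(binom.pmf(k, n, p))
H_pois = shannon_entropy(poisson.pmf(k, n * p))

# Relative loss in Shannon entropy when the approximation replaces the exact law.
relative_loss = (H_binom - H_pois) / H_binom
print(f"H(binomial) = {H_binom:.4f}, H(Poisson) = {H_pois:.4f}, relative loss = {relative_loss:.4%}")
```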
Physica A: Statistical Mechanics and its Applications, 2017
Highlights
1. Two estimation methods of the Fisher Information Measure (FIM) and Shannon entropy (SE) are analysed.
2. One is based on discretizing the FIM and SE formulae; the other on kernel-based estimation of the probability density function.
3. FIM (SE) estimated using the discrete-based approach is approximately constant with σ, but decreases (increases) with the bin number L.
4. FIM (SE) estimated using the kernel-based approach is very close to the theoretic value for any σ.
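For orientation, the quantities being estimated are usually written as follows (standard textbook definitions, not quoted from the paper):

$$H = -\int f(x)\,\ln f(x)\,dx, \qquad \mathrm{FIM} = \int \frac{[f'(x)]^2}{f(x)}\,dx,$$

with discretized (bin-probability) counterparts

$$H \approx -\sum_{i=1}^{L} p_i \ln p_i, \qquad \mathrm{FIM} \approx \sum_{i=1}^{L-1} \frac{(p_{i+1}-p_i)^2}{p_i}.$$

For a Gaussian process with standard deviation $\sigma$, the theoretic values are $H = \tfrac{1}{2}\ln(2\pi e\sigma^2)$ and $\mathrm{FIM} = 1/\sigma^2$.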
2000
This memo contains proofs that the Shannon entropy is the limiting case of both the Rényi entropy and the information. These results are also confirmed experimentally. We conclude with some general observations on the utility of entropy measures.
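For the Rényi part, the limiting relation being proved is the standard one; a short derivation (sketched here for reference, not quoted from the memo) follows from L'Hôpital's rule:

$$H_\alpha(p) = \frac{1}{1-\alpha}\,\ln\!\Big(\sum_i p_i^\alpha\Big), \qquad
\lim_{\alpha \to 1} H_\alpha(p)
= \lim_{\alpha \to 1} \frac{\sum_i p_i^\alpha \ln p_i}{-\sum_i p_i^\alpha}
= -\sum_i p_i \ln p_i = H(p),$$

differentiating numerator and denominator with respect to $\alpha$ and using $\sum_i p_i = 1$.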
Estimating the entropy of finite strings has applications in areas such as event detection, similarity measurement or in the performance assessment of compression algorithms. This report compares a variety of computable information measures for finite strings that may be used in entropy estimation. These include Shannon's n-block entropy, the three variants of the Lempel-Ziv production complexity, and the lesser known T-entropy. We apply these measures to strings derived from the logistic map, for which Pesin's identity allows us to deduce corresponding Shannon entropies (Kolmogorov-Sinai entropies) without resorting to probabilistic methods.
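As a hedged illustration of the n-block approach on a logistic-map string (the Lempel-Ziv production complexity and T-entropy measures mentioned above are not implemented here), this Python sketch symbolizes the map with the usual partition at x = 1/2 and estimates the block entropy rate; at r = 4 the Lyapunov exponent, and hence by Pesin's identity the Kolmogorov-Sinai entropy, equals ln 2, which the estimate should approach.

```python
import numpy as np
from collections import Counter

def logistic_symbols(r=4.0, n=200_000, x0=0.4, burn=1000):
    """Binary symbolic sequence of the logistic map x -> r*x*(1 - x),
    using the partition at x = 1/2."""
    x, out = x0, []
    for i in range(n + burn):
        x = r * x * (1 - x)
        if i >= burn:
            out.append(1 if x >= 0.5 else 0)
    return out

def block_entropy_rate(symbols, n_block):
    """Shannon n-block entropy H_n / n (nats per symbol) from relative block frequencies."""
    blocks = [tuple(symbols[i:i + n_block]) for i in range(len(symbols) - n_block + 1)]
    counts = np.array(list(Counter(blocks).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p)) / n_block

sym = logistic_symbols()
for n in (1, 4, 8):
    print(n, block_entropy_rate(sym, n))  # should approach ln 2 ≈ 0.693 for r = 4
```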
International Statistical Review, 2020
Summary: Starting from the pioneering works of Shannon and Wiener in 1948, a plethora of works on entropy have been reported in different directions. Entropy-related review work in the direction of statistical inference has, to the best of our knowledge, not been reported so far. Here, we have tried to collect all possible works in this direction from the last seven decades so that people interested in entropy, especially new researchers, can benefit.
1998
It is claimed that the way the cogentropy defines the probabilities of events in a given data set, when only a possibly non-representative subset of the entire set is known, provides better estimates than the probabilities conventionally used in the Shannon entropy. In this paper, we apply both entropy measures to estimate the expected uncertainty in the data when the whole data set is known; this expected uncertainty is simply the Shannon entropy of the entire set. The effect of the size of the subset on the approximations computed by both entropy measures is investigated. The study considers both the conditional and unconditional cases.
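The cogentropy formula itself is not reproduced above, so only the Shannon side can be sketched. The following Python snippet (an illustration under assumed synthetic data, not the paper's procedure) shows how a plug-in Shannon estimate computed from subsets of increasing size approaches the value obtained when the entire set is known.

```python
import numpy as np
from collections import Counter

def shannon_entropy(sample):
    """Plug-in Shannon entropy from relative frequencies of the sample."""
    counts = np.array(list(Counter(sample).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
data = rng.integers(0, 8, size=10_000)   # the "entire" data set (synthetic)
H_full = shannon_entropy(data)           # expected uncertainty when the whole set is known

for m in (50, 200, 1000, 5000):
    subset = rng.choice(data, size=m, replace=False)
    print(f"subset size {m:5d}: H = {shannon_entropy(subset):.4f}  (full set: {H_full:.4f})")
```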
arXiv: Other Statistics, 2019
Starting from the pioneering works of Shannon and Wiener in 1948, a plethora of works on entropy have been reported in different directions. Entropy-related review work in the direction of statistics, reliability and information science has, to the best of our knowledge, not been reported so far. Here we have tried to collect all possible works in this direction during the period 1948-2018 so that people interested in entropy, especially new researchers, can benefit.
Highlights
• Two estimation methods (discretization and kernel-based approach) are applied to FIM and SE.
• FIM (SE) estimated by the discrete approach is nearly constant with σ.
• FIM (SE) estimated by the discrete approach decreases (increases) with the bin number.
• FIM (SE) estimated by the kernel-based approach is close to the theoretic value for any σ.

Abstract
The performance of two estimators of the Fisher Information Measure (FIM) and Shannon entropy (SE) is investigated: one based on the discretization of the FIM and SE formulae (discrete-based approach) and the other on kernel-based estimation of the probability density function (pdf) (kernel-based approach). The two approaches are employed to estimate the FIM and SE of Gaussian processes (with different values of σ and size N), whose theoretic FIM and SE depend on the standard deviation σ. The FIM (SE) estimated by the discrete-based approach is approximately constant with σ, but decreases (increases) with the bin number L; in particular, the discrete-based approach furnishes a rather correct estimation of FIM (SE) for L ∝ σ. Furthermore, for small values of σ, the larger the size N of the series, the smaller the mean relative error; while for large values of σ, the larger the size N of the series, the larger the mean relative error. The FIM (SE) estimated by the kernel-based approach is very close to the theoretic value for any σ, and the mean relative error decreases as the length of the series increases. Comparing the results obtained with the two approaches, the kernel-based estimates of FIM and SE are much closer to the theoretic values for any σ and any N, and are to be preferred to the discrete-based estimates.
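A minimal sketch of the two estimation routes, for SE only (FIM omitted), assuming a Gaussian sample, NumPy/SciPy, and arbitrary choices of σ, N and L:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
sigma, N = 2.0, 10_000
x = rng.normal(0.0, sigma, size=N)

# Theoretic Shannon entropy of a Gaussian pdf: 0.5 * ln(2*pi*e*sigma^2)
H_true = 0.5 * np.log(2 * np.pi * np.e * sigma**2)

# Discrete-based estimate: -sum p_i ln p_i over an L-bin histogram
# (note how it changes with the bin number L, as the abstract describes).
for L in (20, 50, 200):
    counts, _ = np.histogram(x, bins=L)
    p = counts[counts > 0] / counts.sum()
    print(f"L = {L:3d}: discrete SE = {-np.sum(p * np.log(p)):.4f}")

# Kernel-based estimate: fit the pdf by Gaussian KDE, then average -ln f(x) over the sample.
kde = gaussian_kde(x)
H_kern = -np.mean(np.log(kde(x)))
print(f"kernel SE = {H_kern:.4f}   theoretic SE = {H_true:.4f}")
```

Consistent with the abstract, the histogram-based value drifts with L, while the kernel-based value stays near the theoretic 0.5 ln(2πeσ²).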
Entropy
This study attempts to extend the prevailing definition of informational entropy, where entropy relates to the amount of reduction of uncertainty or, indirectly, to the amount of information gained through measurements of a random variable. The approach adopted herein describes informational entropy not as an absolute measure of information, but as a measure of the variation of information. This makes it possible to obtain a single value for informational entropy, instead of several values that vary with the selection of the discretizing interval, when discrete probabilities of hydrological events are estimated through relative class frequencies and discretizing intervals. Furthermore, the present work introduces confidence limits for the informational entropy function, which facilitates a comparison between the uncertainties of various hydrological processes with different scales of magnitude and different probability structures. The work addresses hydrologists and environmental engineers more than it does mathematicians and statisticians. In particular, it is intended to help solve information-related problems in hydrological monitoring design and assessment. This paper first considers the selection of probability distributions of best fit to hydrological data, using generated synthetic time series. Next, it attempts to assess hydrometric monitoring duration in a network, this time using observed runoff data series. In both applications, it focuses, basically, on the theoretical background for the extended definition of informational entropy. The methodology is shown to give valid results in each case.
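To make the discretization issue concrete, the following sketch (illustrative only; the synthetic series and class widths are assumptions, not the paper's data) computes the discrete entropy from relative class frequencies at several discretizing intervals and shows that the value shifts with the interval, which is precisely the dependence the extended definition is intended to remove.

```python
import numpy as np

rng = np.random.default_rng(2)
flows = rng.lognormal(mean=3.0, sigma=0.8, size=5_000)   # synthetic runoff-like series

# Discrete entropy from relative class frequencies depends on the class (bin) width:
for width in (5.0, 10.0, 25.0):
    edges = np.arange(flows.min(), flows.max() + width, width)
    counts, _ = np.histogram(flows, bins=edges)
    p = counts[counts > 0] / counts.sum()
    H = -np.sum(p * np.log2(p))
    print(f"class width {width:5.1f}: H = {H:.3f} bits")
```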
Entropy, 2004
Entropy has been the main tool in the analysis of the concept of information since information theory was conceived in the work of Shannon more than fifty years ago. There have been some attempts to find a more general measure of information, but their outcomes were of more formal, theoretical interest, and none has provided better insight into the nature of information. The strengths of entropy seemed so obvious that not much effort has been made to find an alternative that gives different values but is consistent with entropy, in the sense that the results obtained in information theory thus far can be reproduced with the new measure. In this article the need for such an alternative measure is demonstrated on the basis of a historical review of the problems with the conceptualization of information. An alternative measure is then presented in the context of a modified definition of information applicable outside the conduit metaphor of Shannon's approach, and formulated without reference to uncertainty. It has several features superior to those of entropy. For instance, unlike entropy it can be easily and consistently extended to continuous probability distributions, and unlike differential entropy this extension is always positive and invariant with respect to linear transformations of coordinates.
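The two closing claims can be compared against the textbook behaviour of differential entropy (standard facts, not drawn from the article): it is neither guaranteed to be positive nor invariant under linear changes of coordinates,

$$h(X) = -\int f(x)\,\ln f(x)\,dx, \qquad h(aX + b) = h(X) + \ln|a|,$$

and, for example, $X$ uniform on $[0, \tfrac{1}{2}]$ gives $h(X) = -\ln 2 < 0$. The alternative measure described above is claimed to avoid both defects.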