2000, Journal of The Franklin Institute-engineering and Applied Mathematics
This paper describes a method to randomly generate vectors of symbol probabilities so that the corresponding discrete memoryless source has a prescribed entropy. One application is to Monte Carlo simulation of the performance of noiseless variable length source coding.
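One simple way to realize the idea described above (a sketch, not the paper's own construction) is to draw a random PMF and then bisect on a power-tilt exponent until the tilted distribution hits the prescribed entropy; tilting toward larger exponents concentrates the mass and lowers the entropy, so the target is bracketed between the uniform case and a near-degenerate one. The function names here are illustrative.

```python
import math
import random

def tilt(q, t):
    """Power-tilt a PMF: p_i proportional to q_i**t.
    t = 0 gives the uniform distribution; large t concentrates the mass."""
    w = [qi ** t for qi in q]
    s = sum(w)
    return [wi / s for wi in w]

def entropy(p):
    """Shannon entropy in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def random_pmf_with_entropy(k, h_target, rng=random.Random(0)):
    """Draw a random k-symbol PMF, then bisect on the tilt exponent t
    until the tilted PMF has (approximately) the target entropy."""
    assert 0 < h_target < math.log2(k)
    q = [rng.random() for _ in range(k)]
    s = sum(q)
    q = [x / s for x in q]
    lo, hi = 0.0, 1.0
    # entropy(tilt(q, t)) decreases in t; grow hi until the target is bracketed
    while entropy(tilt(q, hi)) > h_target:
        hi *= 2.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if entropy(tilt(q, mid)) > h_target:
            lo = mid
        else:
            hi = mid
    return tilt(q, 0.5 * (lo + hi))

p = random_pmf_with_entropy(8, 2.0)  # 8-symbol source with H = 2 bits
```

Each call with a fresh random seed yields a different probability vector with the same entropy, which is what a Monte Carlo study of a source code's performance at fixed entropy needs.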
IEEE Transactions on Information Theory, 1998
A random number generator generates fair coin flips by processing deterministically an arbitrary source of nonideal randomness. An optimal random number generator generates asymptotically fair coin flips from a stationary ergodic source at a rate of bits per source symbol equal to the entropy rate of the source. Since optimal noiseless data compression codes produce incompressible outputs, it is natural to investigate their capabilities as optimal random number generators. In this paper we show under general conditions that optimal variable-length source codes asymptotically achieve optimal variable-length random bit generation in a rather strong sense. In particular, we show in what sense the Lempel-Ziv algorithm can be considered an optimal universal random bit generator from arbitrary stationary ergodic random sources with unknown distributions.
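For contrast with the optimal generators discussed above, the classical von Neumann extractor (not the paper's method) shows the basic idea of producing exactly fair bits deterministically from a biased IID source, albeit at a rate p(1-p) per input bit, below the entropy rate h(p) that an optimal generator such as Lempel-Ziv asymptotically attains:

```python
import random

def von_neumann_extract(bits):
    """Classic von Neumann extractor: scan a biased IID bit stream in
    non-overlapping pairs; emit 0 for (0,1), 1 for (1,0), and discard
    the equal pairs (0,0) and (1,1). Outputs are exactly fair bits
    whenever the input pairs are IID."""
    out = []
    for b0, b1 in zip(bits[::2], bits[1::2]):
        if b0 != b1:
            out.append(b0)
    return out

rng = random.Random(42)
biased = [1 if rng.random() < 0.8 else 0 for _ in range(100000)]
fair = von_neumann_extract(biased)
# the empirical bias of `fair` is close to 1/2 despite the 0.8 input bias
```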
2006
We prove a coding theorem for the class of variable-to-fixed length codes and memoryless processes, using a generalized version of the average word length and Rényi's entropy. Furthermore, a generalized version of Tunstall's algorithm is introduced and its optimality is proved.
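The classical (ungeneralized) Tunstall construction that the abstract builds on can be sketched in a few lines for a binary memoryless source: grow a parse tree by repeatedly splitting the most probable leaf, so that the variable-length source words map to fixed-length code indices. This is an illustrative sketch of the standard algorithm, not of the paper's generalized version.

```python
import heapq

def tunstall(probs, n_words):
    """Build a Tunstall parse dictionary for a memoryless source:
    repeatedly split the most probable leaf into one child per source
    symbol until the tree has n_words leaves. Words are tuples of
    symbol indices; each maps to a fixed-length code index."""
    # store leaves as (negative probability, word) so heapq acts as a max-heap
    heap = [(-p, (i,)) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    while len(heap) < n_words:
        neg_p, word = heapq.heappop(heap)       # most probable leaf
        for i, p in enumerate(probs):
            heapq.heappush(heap, (neg_p * p, word + (i,)))
    return sorted(word for _, word in heap)

# binary source with P(0) = 0.7, dictionary of 4 words (2 code bits each)
words = tunstall([0.7, 0.3], 4)
```

For a binary source every split adds exactly one leaf, so any dictionary size is reachable; the resulting word set is prefix-free and complete (the word probabilities sum to one).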
IEEE Transactions on Information Theory, 1973
IEEE Transactions on Communications, 2000
This paper describes a family of codes for entropy coding of memoryless sources. These codes are defined by sets of production rules of the form
IEEE Transactions on Information Theory, 1999
The problem of coding low-entropy information sources is considered. Since the run-length code was proposed by Shannon about 50 years ago, it has been known that such sources admit coding methods much simpler than those for sources of a general type. However, known coding methods for low-entropy sources do not attain a prescribed redundancy. In this correspondence, a new method of coding low-entropy sources is offered. It attains a given redundancy r with almost the same encoder and decoder memory size as that obtained by Ryabko for general methods, while encoding and decoding much faster.
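The run-length idea attributed to Shannon above can be sketched concretely: a low-entropy binary source emits long runs of zeros, so encoding the run lengths between successive ones is already far more compact than encoding the bits directly. This is only an illustration of the run-length transform, not the correspondence's new method.

```python
def rle_encode(bits):
    """Run-length transform for a sparse binary sequence: record the
    lengths of the zero-runs preceding each 1, plus the trailing
    zero-run. Effective when P(1) is small (a low-entropy source)."""
    runs, run = [], 0
    for b in bits:
        if b == 0:
            run += 1
        else:
            runs.append(run)
            run = 0
    return runs, run  # trailing zeros returned separately

def rle_decode(runs, tail):
    """Invert the transform exactly."""
    bits = []
    for r in runs:
        bits.extend([0] * r + [1])
    bits.extend([0] * tail)
    return bits

bits = [0, 0, 0, 1, 0, 1, 0, 0, 0, 0, 0]
runs, tail = rle_encode(bits)  # runs = [3, 1], tail = 5
```

The run lengths themselves would then be entropy-coded (e.g., with a Golomb-style or universal integer code); the correspondence's contribution concerns doing this stage with low memory and high speed.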
Variable-length codes can provide compression for data communication. Such codes may be used not only when the source statistics are known but also when the source probability distribution is unknown, so that a source with equal symbol probabilities (equiprobable symbols) can, or must, be assumed. This paper presents variable-length codes whose codewords differ in length by at most one code symbol. Such codes suit the efficient encoding of sources with equiprobable symbols. We accommodate non-binary codes and present an iterative algorithm for the construction of such codes. We also calculate the average codeword length for such codes, which extends Krichevski's result for binary codes [5]. Finally, we propose a scheme that allows the code to be communicated efficiently from transmitter to receiver.
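For the binary case (the Krichevski setting the abstract extends), such a code is easy to construct explicitly: with m equiprobable symbols and k = floor(log2 m), Kraft's equality forces a = 2^(k+1) - m codewords of length k and b = 2(m - 2^k) codewords of length k+1. A sketch of this construction, with illustrative function names:

```python
import math

def quasi_uniform_code(m):
    """Binary prefix code for m equiprobable symbols whose codeword
    lengths differ by at most one: a = 2**(k+1) - m words of length k
    and 2*(m - 2**k) words of length k+1, where k = floor(log2 m)."""
    k = int(math.floor(math.log2(m)))
    a = 2 ** (k + 1) - m                 # number of short codewords
    words = [format(i, f'0{k}b') for i in range(a)]
    for i in range(a, 2 ** k):           # extend remaining k-bit prefixes
        prefix = format(i, f'0{k}b')
        words += [prefix + '0', prefix + '1']
    return words

code = quasi_uniform_code(5)             # ['00', '01', '10', '110', '111']
avg_len = sum(len(w) for w in code) / len(code)   # 2.4 bits per symbol
```

The average length (a*k + b*(k+1))/m approaches log2 m as m grows; the paper's contribution is the non-binary generalization of this construction and its average-length analysis.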
2020 IEEE International Symposium on Information Theory (ISIT), 2020
We consider a scenario wherein two parties, Alice and Bob, are provided $X_1^n$ and $X_2^n$ — samples that are IID from a PMF $p_{X_1 X_2}$. Alice and Bob can communicate to Charles over (noiseless) communication links of rates $R_1$ and $R_2$, respectively. Their goal is to enable Charles to generate samples $Y^n$ such that the triple $(X_1^n, X_2^n, Y^n)$ has a PMF that is close, in total variation, to $\prod p_{X_1 X_2 Y}$. In addition, the three parties may possess shared common randomness at rate $C$. We address the problem of characterizing the set of rate triples $(R_1, R_2, C)$ for which the above goal can be accomplished. We provide a set of sufficient conditions, i.e., an achievable rate region, for this three-party setup. Our work also provides a complete characterization of the point-to-point setup wherein Bob is absent and Charles is provided with side information.
IEEE Transactions on Information Theory, 1999
This work provides techniques for applying the channel coding theorem and the resulting error exponent, originally derived for totally random block-code ensembles, to ensembles of codes with less restrictive randomness demands. As an example, the random coding technique can be applied even to an ensemble that contains a single code. For a specific linear code, we obtain an upper bound on the error probability that equals Gallager's random-coding bound up to a factor determined by the maximum ratio between the weight distribution of the code and the expected random weight distribution.
In near-lossless source coding with decoder-only side information, i.e., Slepian-Wolf coding (with one encoder), a source X with finite alphabet is first encoded and then later decoded, subject to a small error probability, with the help of side information Y with finite alphabet available only to the decoder. The classical result by Slepian and Wolf shows that the minimum average compression rate achievable asymptotically, subject to a small error probability constraint, for a memoryless pair (X, Y) is given by the conditional entropy H(X|Y). In this paper, we look beyond conditional entropy and investigate the tradeoff between compression rate and decoding error spectrum in Slepian-Wolf coding when the decoding error probability goes to zero exponentially fast. It is shown that when the decoding error probability goes to zero at the speed of $2^{-\delta n}$, where $\delta$ is a positive constant and n denotes the length of the source sequences, the minimum average compression rate achievab...
European Transactions on Telecommunications, 1993
We show that there is a universal noiseless source code for the class of all countably infinite memoryless sources for which a fixed given uniquely decodable code has finite expected codeword length. This source code is derived from a class of distribution estimation procedures which are consistent in expected information divergence.