The primary aim of this project is to implement techniques for fingerprint image enhancement and minutiae extraction. Recognition of people by means of their biometric characteristics is very popular in society, but a fingerprint image contains an enormous amount of data, and large volumes of fingerprints are collected and stored every day in a wide range of applications. A given fingerprint is divided into small blocks called patches, and an over-complete dictionary obtained from a set of fingerprint patches allows us to represent each patch as a sparse linear combination of dictionary atoms. In the algorithm, we first construct a dictionary for predefined fingerprint image patches. Several image compression techniques already exist, namely JPEG, JPEG 2000, and Wavelet Scalar Quantization (WSQ), where JPEG and JPEG 2000 are general-purpose image compression methods. Fingerprint images are rarely of perfect quality. Fingerprint identification methods have been widely used by police agencies and customs to identify criminals and transit passengers since the late nineteenth century, and ISO standardized the characteristics of fingerprints in 2004. The experiments demonstrate that the proposed approach is efficient compared with several competing compression techniques, especially at high compression ratios.
In this paper, we introduce a new fingerprint compression algorithm based on sparse representation. Obtaining an over-complete dictionary from a set of fingerprint patches allows us to represent them as sparse linear combinations of the dictionary atoms. In the algorithm, we first construct a dictionary for predefined fingerprint image patches. Then, for a given fingerprint image, we represent its patches in terms of the dictionary by computing an l0-minimization, and subsequently quantize and encode the representation. In the experiments, we examine the effect of various factors on the compression results. The experiments show that our algorithm is efficient compared with other competing compression techniques (WSQ, JPEG, and JPEG 2000), especially at high compression ratios.
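As an illustration of the sparse-coding step, the sketch below uses Orthogonal Matching Pursuit (OMP), a standard greedy approximation to the l0-minimization the abstract mentions. This is a hedged stand-in, not the paper's implementation: the random dictionary, the 16x16 patch size, and the sparsity level k are hypothetical placeholders.

```python
import numpy as np

def omp(D, y, k):
    """Orthogonal Matching Pursuit: greedily approximate the l0 problem
    min ||x||_0  s.t.  y ~= D @ x, stopping after k atoms."""
    residual = y.copy()
    support = []
    x = np.zeros(D.shape[1])
    for _ in range(k):
        # Pick the atom most correlated with the current residual.
        j = np.argmax(np.abs(D.T @ residual))
        support.append(j)
        # Re-fit the coefficients on the chosen support by least squares.
        coeffs, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coeffs
    x[support] = coeffs
    return x

# Hypothetical setup: 16x16 patches (256-dim) coded against a 512-atom dictionary.
rng = np.random.default_rng(0)
D = rng.standard_normal((256, 512))
D /= np.linalg.norm(D, axis=0)       # unit-norm atoms
patch = rng.standard_normal(256)     # stand-in for a vectorized fingerprint patch
x = omp(D, patch, k=8)               # sparse code: at most 8 nonzero coefficients
```

Only the indices and values of the nonzero coefficients then need to be quantized and entropy coded, which is where the compression gain comes from.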
British Journal of Mathematics & Computer Science, 2014
A data compression algorithm is a signal processing technique used to convert data from a large format to one optimized for compactness. The huge volumes of fingerprint images that must be transmitted across networks of biometric databases are an excellent example of why data compression is important. The cardinal goal of image compression is to obtain the best possible image quality at reduced storage and transmission bandwidth costs. In this paper, a review of different methodological approaches to fingerprint image compression based on the wavelet algorithm is conducted. From the survey of the existing wavelet-based image compression methods, the problems identified include: the limitation of the WSQ standard to a compression ratio of 15:1, which could be improved with a better algorithm; the high complexity of the image encoding process of the existing techniques; and the fact that most of the existing methods require the generation of codebooks or lookup tables, which incur additional computational cost. Additionally, significant degradation of the biometric features of a fingerprint at compression ratios higher than 15:1 remains a major challenge. Therefore, the investigation of an efficient compression method that can significantly reduce fingerprint image size while preserving its biometric properties (the core, ridge endings and bifurcations) is justified.
2015
The fingerprint is the most famous biometric and is widely used in various authentication systems. A human fingerprint exhibits certain details, categorized as minutiae, which can serve as a unique identity of a person if recognized properly. Large volumes of fingerprints are collected and stored every day in a wide range of applications. In this context, compression of these data may become imperative under certain circumstances due to the large amounts of data involved. This paper presents the characteristics of the biometric features of a fingerprint image, discusses the necessity of fingerprint compression, gives an abstract review of different existing fingerprint compression standards such as DCT-based JPEG, DWT-based JPEG 2000, and WSQ, and presents the proposed method, which implements fingerprint image compression using sparse representation.
SUMMARY A new compression algorithm for fingerprint images is introduced. A modified wavelet packet scheme which uses a fixed decomposition structure, matched to the statistics of fingerprint images, is used. Based on statistical studies of the subbands, different compression techniques are chosen for different subbands. The decision is based on the effect of each subband on the reconstructed image, taking into account the characteristics of the Human Visual System (HVS). A noise-shaping bit allocation procedure which considers the HVS is then used to assign the bit rate among subbands. Using Lattice Vector Quantization (LVQ), a new technique for determining the largest radius of the lattice and its scaling factor is presented. The design is based on obtaining the smallest possible Expected Total Distortion (ETD) measure, using the given bit budget. At low bit rates, for the coefficients with high-frequency content, we propose the Positive-Negative Mean (PNM) algorithm to improve the resolution of the reconstructed image. Furthermore, for the coefficients with low-frequency content, a lossless predictive compression scheme is developed. The proposed algorithm results in a high compression ratio and a high reconstructed image quality with a low computational load compared to other available algorithms. Key words: fingerprint compression, wavelet packets, pyramid lattice vector quantization.
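The abstract does not spell out its noise-shaping bit allocation, but the classical log-variance rule such procedures build on can be sketched as follows. The subband variances and the HVS importance weights below are made-up placeholders, not values from the paper.

```python
import numpy as np

def allocate_bits(variances, weights, avg_rate):
    """Classical log-variance bit allocation: subband i receives
    b_i = avg_rate + 0.5 * log2(w_i * var_i / geometric_mean(w * var)).
    `weights` stands in for perceptual (HVS) importance factors."""
    v = np.asarray(variances) * np.asarray(weights)
    gm = np.exp(np.mean(np.log(v)))      # geometric mean of weighted variances
    b = avg_rate + 0.5 * np.log2(v / gm)
    return np.clip(b, 0, None)           # no negative bit rates

# Hypothetical statistics for a 4-subband decomposition at 2 bits/pixel average.
print(allocate_bits([9.0, 2.5, 2.5, 0.4], [1.0, 0.8, 0.8, 0.5], avg_rate=2.0))
```

High-variance, perceptually important subbands receive more bits; the clipping step mimics discarding subbands that contribute little to the reconstructed image.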
2009
Modified Set Partitioning in Hierarchical Trees with Run Length Encoding is a new framework proposed for fingerprint image compression. The proposed method performs better because a larger number of images related to the query fingerprint image are retrieved. Experiments on a database of grayscale bitmap images show that the proposed technique performs well in compression and decompression. We use Peak Signal to Noise Ratio [3] and Mean Square Error [3] to compute the picture quality of fingerprint images.
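For reference, the two quality metrics the abstract cites have standard definitions for 8-bit grayscale images; the sketch below is those textbook formulas, not code from the paper.

```python
import numpy as np

def mse(original, reconstructed):
    """Mean Square Error between two grayscale images."""
    diff = original.astype(np.float64) - reconstructed.astype(np.float64)
    return np.mean(diff ** 2)

def psnr(original, reconstructed, peak=255.0):
    """Peak Signal to Noise Ratio in dB, assuming 8-bit pixel depth."""
    m = mse(original, reconstructed)
    return float('inf') if m == 0 else 10.0 * np.log10(peak ** 2 / m)
```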
IEEE transactions on image processing : a publication of the IEEE Signal Processing Society, 2002
A novel compression algorithm for fingerprint images is introduced. Using wavelet packets and lattice vector quantization, a new vector quantization scheme based on an accurate model for the distribution of the wavelet coefficients is presented. The model is based on the generalized Gaussian distribution. We also discuss a new method for determining the largest radius of the lattice used and its scaling factor, for both uniform and piecewise-uniform pyramidal lattices. The proposed algorithms aim at achieving the best rate-distortion function by adapting to the characteristics of the subimages. In the proposed optimization algorithm, no assumptions about the lattice parameters are made, and no training and multi-quantizing are required. We also show that the wedge region problem encountered with sharply distributed random sources is resolved in the proposed algorithm. The proposed algorithms adapt to variability in input images and to specified bit rates. Compared to other availab...
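These papers fit the generalized Gaussian model via least squares on a nonlinear function of the shape parameter. As a simpler hedged stand-in, the moment-matching estimator below solves Gamma(2/b)^2 / (Gamma(1/b) * Gamma(3/b)) = (E|X|)^2 / E[X^2] for the shape b by grid search; it illustrates the model, not the papers' exact fitting procedure.

```python
import numpy as np
from scipy.special import gamma

def ggd_shape(samples):
    """Moment-matching estimate of the generalized Gaussian shape parameter."""
    x = np.asarray(samples, dtype=np.float64)
    ratio = np.mean(np.abs(x)) ** 2 / np.mean(x ** 2)
    betas = np.linspace(0.1, 4.0, 4000)
    r = gamma(2.0 / betas) ** 2 / (gamma(1.0 / betas) * gamma(3.0 / betas))
    return betas[np.argmin(np.abs(r - ratio))]

# Sanity check: Laplacian samples (true shape = 1) should give an estimate near 1;
# Gaussian samples (true shape = 2) near 2.
rng = np.random.default_rng(1)
print(ggd_shape(rng.laplace(size=100_000)))
```

Wavelet coefficients of natural and fingerprint images typically yield shape parameters well below 2, i.e., much heavier tails than a Gaussian, which is what motivates the GGD model.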
1996
Abstract A new compression algorithm for fingerprint images is introduced. A modified wavelet packet scheme which uses a fixed decomposition structure, matched to the statistics of fingerprint images, is presented. A technique for determining the most important coefficients is introduced. The algorithm uses both hard and soft thresholding schemes to make the procedure fast and efficient. The bit allocation for each subimage of the modified coefficients is determined.
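The hard and soft thresholding rules mentioned here have standard closed forms, sketched below in NumPy; the threshold t itself would come from the paper's coefficient-selection and bit-allocation steps, which are not reproduced here.

```python
import numpy as np

def hard_threshold(c, t):
    """Keep coefficients with magnitude above t, zero the rest."""
    return np.where(np.abs(c) > t, c, 0.0)

def soft_threshold(c, t):
    """Shrink magnitudes toward zero by t, zeroing anything below t."""
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)
```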
1996
This paper presents a new compression algorithm for fingerprint images. A modified wavelet packet scheme based on a fixed decomposition structure, matched to the statistics of fingerprint images, is used to decorrelate the image pixels. Rather than using Huffman coding directly on the quantized coefficients, a hard thresholding and a rounding scheme are applied to the modified mean-removed coefficients, followed by entropy coding.
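A minimal sketch of a threshold-then-round quantizer, plus a first-order entropy estimate that lower-bounds the average code length an entropy coder such as Huffman can approach. The uniform step size is an assumption for illustration; the paper's scheme operates on mean-removed wavelet packet coefficients.

```python
import numpy as np

def quantize(coeffs, step):
    """Hard-threshold at one step, then round to the nearest quantizer level."""
    kept = np.where(np.abs(coeffs) > step, coeffs, 0.0)
    return np.round(kept / step).astype(np.int64)

def entropy_bits_per_symbol(symbols):
    """First-order (Shannon) entropy of the quantized symbol stream."""
    _, counts = np.unique(symbols, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))
```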
2024
Fingerprints have been used as a unique identity of people for a very long time. Large numbers of fingerprints are collected and stored every day in a wide range of applications, and the large volume of fingerprint data stored in a database consumes a great amount of memory. In this paper, a discrete wavelet transform (DWT) technique for fingerprint image compression is introduced. In the present work, the DWT compression technique is applied to different fingerprint images, and a number of wavelets from the DWT family are employed for this purpose. The fingerprint image under consideration is first decomposed by DWT coding and then passed to an embedded zerotree wavelet (EZW) encoding scheme. A comparative study is carried out on all the resulting fingerprint images in terms of Mean Square Error (MSE), Peak Signal to Noise Ratio (PSNR), and compression ratio (CR). In the present study, the Haar, Daubechies (db2), and Symlets (sym2) wavelets give better compression of fingerprint images up to the third level of wavelet decomposition. All the work is done using the well-known mathematical tool MATLAB.
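A rough Python/PyWavelets analogue of this pipeline (the abstract's own implementation is in MATLAB): a three-level 2-D DWT followed by magnitude thresholding as a crude stand-in for full EZW coding. The keep fraction is a made-up parameter, and the nonzero-coefficient count is only a proxy for the true coded size.

```python
import numpy as np
import pywt

def compress_dwt(image, wavelet='haar', level=3, keep=0.05):
    """Decompose with a 2-D DWT, keep only the largest `keep` fraction of
    coefficients (a crude stand-in for EZW coding), then reconstruct."""
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)    # flatten coeffs into one array
    t = np.quantile(np.abs(arr), 1.0 - keep)      # magnitude threshold
    arr = np.where(np.abs(arr) >= t, arr, 0.0)
    rec = pywt.waverec2(
        pywt.array_to_coeffs(arr, slices, output_format='wavedec2'), wavelet)
    cr = arr.size / max(np.count_nonzero(arr), 1)  # rough compression ratio proxy
    return rec, cr

# The abstract's wavelets, 'haar', 'db2', and 'sym2', are all valid PyWavelets names.
```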
Signal Processing, 1997
In this paper, a novel quantization scheme for the wavelet coefficients is introduced. Using the wavelet packet transform (WPT) and lattice vector quantization (LVQ), we present here a new lattice optimization scheme based on an accurate model for the distribution of the wavelet coefficients. The model is based on the generalized Gaussian distribution (GGD). A least squares algorithm on a non-linear function of the shape parameter is formulated to estimate the model parameters. The proposed algorithm adapts to non-stationarity in input images and to given bit rates. Compared to other wavelet-based algorithms, the technique proposed here results in higher reconstructed image qualities for identical bit rates. © 1997 Published by Elsevier Science B.V.