The assessment of iris uniqueness plays a crucial role in analyzing the capabilities and limitations of iris recognition systems. Among the various methodologies proposed, Daugman's approach to iris uniqueness stands out as one of the most widely accepted. According to Daugman, uniqueness refers to the iris recognition system's ability to enroll an increasing number of classes while maintaining a near-zero probability of collision between new and enrolled classes. Daugman's approach involves creating distinct IrisCode templates for each iris class within the system and evaluating the sustainable population under a fixed Hamming distance between codewords. In our previous work [23], we utilized Rate-Distortion Theory (as it pertains to the limits of error-correction codes) to establish boundaries for the maximum possible population of iris classes supported by Daugman's IrisCode, given the constraint of a fixed Hamming distance between codewords. Building upon that research, we propose a novel methodology to evaluate the scalability of an iris recognition system while also measuring iris quality. We achieve this by employing a sphere-packing bound for Gaussian codewords and adopting an approach similar to Daugman's, which utilizes relative entropy as a distance measure between iris classes. To demonstrate the efficacy of our methodology, we illustrate its application on two small datasets of iris images. We determine the sustainable maximum population for each dataset based on the quality of the images. By providing these illustrations, we aim to assist researchers in comprehending the limitations inherent in their recognition systems, depending on the quality of their iris databases.
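As a hedged, simplified illustration of the sphere-packing idea (stated here for binary codewords under Hamming distance, not the Gaussian/relative-entropy setting the paper develops), the classical Hamming bound on the number of supportable codeword classes can be computed as follows; the function name is ours:

```python
from math import comb

def hamming_bound(n: int, d: int) -> int:
    """Sphere-packing (Hamming) upper bound on the number of binary
    codewords of length n with minimum Hamming distance d."""
    t = (d - 1) // 2  # radius of the non-overlapping spheres
    sphere_volume = sum(comb(n, i) for i in range(t + 1))  # points per sphere
    return (2 ** n) // sphere_volume

# Example: length-7 codes with minimum distance 3
print(hamming_bound(7, 3))  # -> 16 (met by the Hamming(7,4) code)
```

The same counting argument, with Gaussian sphere volumes in place of binomial sums, underlies the bound on sustainable iris population described in the abstract.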
Maintenance & Repair costs in heavy-duty trucks are an important component of the total cost of ownership. Due to the very limited availability of real-time data collected from medium- and heavy-duty vehicles using alternative fuels, this topic has not been well studied, resulting in a very slow diffusion of alternative fuel vehicles in the market. This study focuses on collecting maintenance data related to diesel and alternative fuels such as natural gas and propane for the school bus, delivery truck, vocational truck, refuse truck, goods movement truck, and transit bus. The novelty of this work lies in identifying the mixed effects in the maintenance data and using a mixed-effect model for developing a single prediction model on clustered longitudinal data. A mixed-effect random forest machine learning model is trained on the maintenance data for estimating the average cost per mile. The model achieved an R² of 98.96% with a mean square error of 0.0089 $/mile for training and an R² of 94.31% with a mean square error of 0.0312 $/mile for the validation dataset. The prediction model is evaluated on each cluster of data and is observed to capture the variations in each cluster well. Furthermore, the performance of the mixed-effect random forest model is compared with the XGBoost ensemble model. INDEX TERMS Heavy-duty vehicles, vocational trucks, alternative fuel, diesel, maintenance and repair cost, mixed effect model, mixed effect random forest.
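The random-intercept structure that motivates a mixed-effect model on clustered data can be sketched in a few lines of numpy. This toy example is our own illustration, not the authors' mixed-effect random forest: the fixed-effect component is reduced to a grand mean, and each cluster (e.g., a vehicle type) gets a random intercept estimated from its residuals.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated clustered cost-per-mile data: each cluster shares a random
# intercept around a common fixed effect.
n_clusters, n_per = 5, 40
true_fixed = 0.50                                   # $/mile, common mean
true_random = rng.normal(0.0, 0.10, n_clusters)     # per-cluster offsets
cluster = np.repeat(np.arange(n_clusters), n_per)
y = true_fixed + true_random[cluster] + rng.normal(0.0, 0.02, cluster.size)

# Step 1: fit the fixed effect (a grand mean here, standing in for the
# random-forest fixed-effect component of a MERF-style model).
fixed = y.mean()

# Step 2: estimate each cluster's random effect from its residual mean.
residual = y - fixed
random_effects = np.array([residual[cluster == k].mean()
                           for k in range(n_clusters)])

pred = fixed + random_effects[cluster]
mse = np.mean((y - pred) ** 2)
print(f"fixed effect ~ {fixed:.3f} $/mile, MSE = {mse:.4f}")
```

In a mixed-effect random forest, step 1 is replaced by a random forest fit on the covariates, and the two steps are iterated until the random effects stabilize.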
Maintenance costs can represent about 15%–60% of the cost of produced goods depending on the type of goods transported. To comply with stringent emissions regulations, diesel engines are equipped with complex after-treatment systems that demand increased maintenance. The availability of alternative fuels such as natural gas and propane has fostered natural gas and propane powertrain systems as well as electrification options for heavy- and medium-duty vehicles. A critical barrier to adopting alternative fuel vehicles has been the lack of knowledge on comparative vehicle maintenance/repair costs with conventional diesel. Moreover, the region of operation, the type of vehicle operation, and seasonal temperature changes also affect the duty cycle, which impacts the maintenance and repair costs. This study focuses on estimating the cost per mile for heavy-duty vehicles using machine learning models such as random forest, XGBoost, neural networks, and a super-learner model. Th...
IEEE Transactions on Biometrics, Behavior, and Identity Science
Iris is an established modality in biometric recognition applications including consumer electronics, e-commerce, border security, forensics, and de-duplication of identity at a national scale. In light of the expanding usage of biometric recognition, identity clash (when templates from two different people match) is an imperative factor of consideration for a system's deployment. This study explores system capacity estimation by empirically estimating the constrained capacity of an end-to-end iris recognition system (NIR systems with Daugman-based feature extraction) operating at an acceptable error rate, i.e., the number of subjects a system can resolve before encountering an error. We study the impact of six system parameters on an iris recognition system's constrained capacity: number of enrolled identities, image quality, template dimension, random feature elimination, filter resolution, and system operating point. In our assessment, we analyzed 13.2 million comparisons from 5158 unique identities for each of 24 different system configurations. This work provides a framework to better understand iris recognition system capacity as a function of biometric system configurations beyond the operating point, for large-scale applications.
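The Daugman-style comparison underlying such capacity studies is the masked fractional Hamming distance between binary templates. A minimal sketch follows; the function and variable names are ours, and the 2048-bit length is only a typical choice, not a claim about the system evaluated above:

```python
import numpy as np

def fractional_hd(code_a, code_b, mask_a, mask_b):
    """Daugman-style fractional Hamming distance between two binary iris
    codes, counting only bits that are valid in both occlusion masks."""
    valid = mask_a & mask_b
    disagree = (code_a ^ code_b) & valid
    return disagree.sum() / valid.sum()

rng = np.random.default_rng(1)
n = 2048  # a typical IrisCode length in bits
a = rng.integers(0, 2, n, dtype=np.uint8)
b = rng.integers(0, 2, n, dtype=np.uint8)
mask = np.ones(n, dtype=np.uint8)

print(fractional_hd(a, a, mask, mask))  # identical codes -> 0.0
print(fractional_hd(a, b, mask, mask))  # statistically independent codes -> ~0.5
```

Genuine comparisons cluster near 0.0 and impostor comparisons near 0.5; the system operating point is a threshold on this score.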
Cross-spectral face verification between short-wave infrared (SWIR) and visible light (VIS) face images poses a challenge, motivated by various real-world applications such as surveillance at night or in harsh environments. This paper proposes a hybrid solution that takes advantage of both traditional feature engineering and modern deep learning techniques to overcome the issue of limited imagery as encountered in the SWIR band. First, the paper revisits the theory of measurement levels. Then, two new operators are introduced which act at the nominal and interval levels of measurement, named the Nominal Measurement Descriptor (NMD) and the Interval Measurement Descriptor (IMD), respectively. A composite operator, Gabor Multiple-Level Measurement (GMLM), is further proposed which fuses multiple levels of measurement. Finally, the fused features of GMLM are passed through a succinct and efficient neural network based on PCA. The network selects informative feature...
2018 IEEE Global Conference on Signal and Information Processing (GlobalSIP), 2018
This paper analyzes the potential of the Spatial Fourier transform (SFT) for detection of a periodic astrophysical signal and for estimation of the signal's parameters. In place of de-dispersing filter bank data for each Dispersion Measure (DM) trial and then integrating over frequency channels to yield a one-dimensional signal, we apply the SFT to filter bank data, then detect periodic astrophysical signals and analyze their parameters such as DM and rotational period. This approach allows searching for periodic astrophysical signals in real time; its complexity is dominated by the complexity of the SFT. The results of our analysis show promise. Using simulated data, we demonstrate that it takes about 3 minutes of observation time to detect a pulsar at an S/N value of 8σ. The SFT data also provide information about the rotation of pulsars and lower and upper bounds on their DM value.
Pattern recognition problems are analyzed by applying model-based likelihood approaches, with the major emphasis placed on the asymptotic and numerical analysis of the average probability of recognition error. We consider two types of recognition systems: (i) systems with known models for the observed data; and (ii) systems with some unknown parameters in distributions of the observed data, designed using a limited amount of training data. The analysis for the first problem is motivated by a growing commercial interest in verification and identification systems. Our analysis is based on the notion of physical signature. The signatures are modeled as realizations of a random process with known statistics. The focus of performance analysis for identification systems is placed on determining their capabilities. The test statistic for this problem is a vector of information densities. For identification systems with the number of templates growing exponentially, capacity of the system i...
Iris biometric is one of the most reliable biometrics with respect to performance. However, this reliability is a function of the ideality of the data. One of the most important steps in processing nonideal data is reliable and precise segmentation of the iris pattern from the remaining background. In this paper, a segmentation methodology that aims at compensating for various nonidealities contained in iris images during segmentation is proposed. The virtue of this methodology lies in its capability to reliably segment nonideal imagery that is simultaneously affected by such factors as specular reflection, blur, lighting variation, occlusion, and off-angle images. We demonstrate the robustness of our segmentation methodology by evaluating ideal and nonideal data sets, namely, the Chinese Academy of Sciences iris data version 3 interval subdirectory, the iris challenge evaluation data, the West Virginia University (WVU) data, and the WVU off-angle data. Furthermore, we compare our perfo...
2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2017
This paper presents a blind algorithm for the automatic detection of isolated astrophysical pulses. The detection algorithm is applied to spectrograms (also known as “filter bank data” or “the (t, f) plane”). The detection algorithm comprises a sequence of three steps: (1) a Radon transform is applied to the spectrogram, (2) a Fourier transform is applied to each projection parametrized by an angle, and the total power in each projection is calculated, and (3) the total power of all projections above 90° is compared to the total power of all projections below 90°, and a decision in favor of an astrophysical pulse being present or absent is made. Once a pulse is detected, its Dispersion Measure (DM) is estimated by fitting an analytically developed expression for a transformed spectrogram containing a pulse, with a varying value of DM, to the actual data. The performance of the proposed algorithm is numerically analyzed.
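The three steps above can be sketched roughly as follows. This is our own simplified rendering, not the paper's implementation: the Radon transform is a naive rotate-and-sum, the per-projection power is computed directly from the projection (equivalent, by Parseval's theorem, to summing FFT magnitudes squared up to scale), and the orientation convention depends on how the (t, f) axes are laid out.

```python
import numpy as np
from scipy import ndimage

def radon_projections(spectrogram, angles):
    """Naive Radon transform: rotate the (t, f) plane by each angle and
    integrate along one axis to obtain a 1-D projection per angle."""
    return [ndimage.rotate(spectrogram, ang, reshape=False, order=1).sum(axis=0)
            for ang in angles]

def detect_pulse(spectrogram, angles=np.arange(1.0, 180.0, 1.0)):
    """Compare total projection power above vs. below 90 degrees, the
    decision rule sketched in the abstract."""
    projections = radon_projections(spectrogram, angles)
    # Parseval: sum of squared projection samples ~ total FFT power.
    power = np.array([np.sum(p ** 2) for p in projections])
    above = power[angles > 90].sum()
    below = power[angles < 90].sum()
    return above > below, above / below
```

The DM-estimation step (fitting the analytic transformed-spectrogram expression) is omitted here, as it depends on the paper's derived expression.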
Journal of Astronomical Telescopes, Instruments, and Systems, 2020
Abstract. We present an algorithm for the detection of candidate astronomical pulses. It is implemented in several steps. First, a spectrogram of a dispersed astronomical pulse is linearized in observing frequency, followed by application of the Radon transform. The result of the transformation is displayed as a two-dimensional function. Next, the function is smoothed using a spatial low-pass filter. Finally, the maximum of the function above the 90-deg angle is compared to the maximum of the standard deviation of the noise below the 90-deg angle, and a decision in favor of an astronomical pulse being present or absent is made. Once a pulse is detected, its dispersion measure (DM) is estimated by means of a basic equation relating the slope of the linearized dispersed pulse and the DM value. Performance of the algorithm is analyzed by applying it to a set of simulated fast radio bursts, experimental data of the Masui pulse, and data of seven rotating radio transients. The detection algorithm demonstrates results comparable to those of the conventional pulse detection algorithm.
Shown are three histogram plots of HDs obtained by varying parameters in one of three major parameter groups. The rightmost histogram (marked in the dark color) was obtained when images generated by varying "FIBER" parameters were compared. The histogram in the middle (marked in the gray color) was obtained by comparing images with random "COLLARETTE" parameters. The leftmost histogram (marked in the light color) was generated from images produced by varying a set of "BASIC" parameters.
2.8 A gallery of synthetic iris images generated using a model-based, anatomy-based approach. Iris 4 is a real iris image borrowed from the CASIA data set.
2.9 Shown are three unwrapped and enhanced iris images. The images are samples from (a) the CASIA data set, (b) the WVU non-ideal iris data set, and (c) a data set of synthetic irises generated using our model-based approach.
2.10 Original images (left column) and their real components of log-Gabor filtered results (right column).
Matching facial images acquired in different electromagnetic spectral bands remains a challenge. An example of this type of comparison is matching active or passive infrared (IR) against a gallery of visible face images. When combined with cross-distance, this problem becomes even more challenging due to the deteriorated quality of the IR data. As an example, we consider a scenario where visible light images are acquired at a short standoff distance while IR images are long range data. To address the difference in image quality due to atmospheric and camera effects, typical degrading factors observed in long range data, we propose two approaches that allow us to coordinate the image quality of visible and IR face images. The first approach involves Gaussian-based smoothing functions applied to images acquired at a short distance (visible light images in the case we analyze). The second approach involves denoising and enhancement applied to low quality IR face images. A quality measure tool called the Adaptive Sharpness Measure, an improvement of the well-known Tenengrad method, is utilized as guidance for the quality parity process. For the recognition algorithm, a composite operator combining Gabor filters, Local Binary Patterns (LBP), generalized LBP, and the Weber Local Descriptor (WLD) is used. The composite operator encodes both magnitude and phase responses of the Gabor filters. The combination of LBP and WLD utilizes both the orientation and intensity information of edges. Different IR bands, short-wave infrared (SWIR) and near-infrared (NIR), and different long standoff distances are considered. The experimental results show that in all cases the proposed technique of image quality parity (both approaches) benefits the final recognition performance.
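The classical Tenengrad measure that the Adaptive Sharpness Measure improves upon, together with the Gaussian smoothing used for quality parity, can be sketched as follows. This is our own minimal illustration, not the paper's implementation, and the random array merely stands in for a face crop:

```python
import numpy as np
from scipy import ndimage

def tenengrad(img, threshold=0.0):
    """Tenengrad sharpness: mean squared Sobel gradient magnitude over
    pixels whose gradient energy exceeds a threshold."""
    gx = ndimage.sobel(img, axis=0, output=float)
    gy = ndimage.sobel(img, axis=1, output=float)
    g2 = gx ** 2 + gy ** 2
    return g2[g2 > threshold].mean() if np.any(g2 > threshold) else 0.0

rng = np.random.default_rng(3)
sharp = rng.random((64, 64))                          # stand-in for a VIS crop
blurred = ndimage.gaussian_filter(sharp, sigma=2.0)   # quality-parity smoothing

print(tenengrad(sharp) > tenengrad(blurred))  # smoothing lowers sharpness -> True
```

Increasing the Gaussian sigma drives the sharpness of the high-quality image down toward that of its degraded counterpart, which is the quality-parity mechanism described above.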
Matching images acquired in different electromagnetic bands remains a challenging problem. An example of this type of comparison is matching active or passive infrared (IR) against a gallery of visible face images, known as cross-spectral face recognition. Among many unsolved issues is that of the quality disparity of the heterogeneous images. Images acquired in different spectral bands are of unequal image quality due to distinct imaging mechanisms, standoff distances, imaging environments, etc. To reduce the effect of quality disparity on recognition performance, one can manipulate images to either improve the quality of poor-quality images or degrade the high-quality images to the level of their heterogeneous counterparts. To estimate the level of discrepancy in the quality of two heterogeneous images, a quality metric such as image sharpness is needed. It provides guidance on how much quality improvement or degradation is appropriate. In this work we consider sharpness as a relative measure of heterogeneous image quality. We propose a generalized definition of sharpness by first achieving image quality parity and then finding and building a relationship between the image quality of two heterogeneous images. Therefore, the new sharpness metric is named heterogeneous sharpness. Image quality parity is achieved by experimentally finding the optimal cross-spectral face recognition performance where the quality of the heterogeneous images is varied using a Gaussian smoothing function with different standard deviations. This relationship is established using two models: a regression model and a neural network.
To train, test, and validate the models, we use composite operators developed in our lab to extract features from heterogeneous face images and use the sharpness metric to evaluate the face image quality within each band. Images from three different spectral bands, visible light, near infrared, and short-wave infrared, are considered in this work. Both the error of the regression model and the validation error of the neural network are analyzed.
Abstract. Cross-spectral image matching is a challenging research problem motivated by various applications, including surveillance, security, and identity management in general. An example of this problem includes cross-spectral matching of active infrared (IR) or thermal IR face images against a dataset of visible light images. A summary of recent developments in the field of cross-spectral face recognition by the authors is presented. In particular, it describes the original form and two variants of a local operator named the composite multilobe descriptor (CMLD) for facial feature extraction with the purpose of cross-spectral matching of near-IR, short-wave IR, mid-wave IR, and long-wave IR to a gallery of visible light images. The experiments demonstrate that the variants of CMLD outperform the original CMLD and other recently developed composite operators used for comparison. In addition to different IR spectra, various standoff distances from close-up (1.5 m) to intermediate (50 m) and long (106 m) are also investigated. Performance of CMLD I to III is evaluated for each of the three cases of distances. The newly developed operators, CMLD I to III, are further utilized to conduct a study on cross-spectral partial face recognition where different facial regions are compared in terms of the amount of useful information they contain for the purpose of conducting cross-spectral face recognition. The experimental results show that among the three facial regions considered in the experiments, the eye region is the most informative for all IR spectra at all standoff distances.
2016 United States National Committee of URSI National Radio Science Meeting (USNC-URSI NRSM), 2016
This paper presents two methods for the blind detection of Fast Radio Bursts (FRBs). The methods are applied to filter bank data (also known as a spectrogram or (t, f) plane). The methods are designed to detect FRB signatures in the 1D Fourier Transforms (1DFFT) and in the Spatial Fourier Transforms (2DFFT) of the (t, f) data. The signature of an FRB in the 2DFFT plane is a line with a slope between zero and 90° (but not zero or 90°) in the range of low spatial frequencies. In the sum of magnitudes of the 1DFFT of each frequency channel in the (t, f) plane, the signature of an FRB appears as a relatively strong and approximately Gaussian bell-shaped pulse centered around zero frequency. The detection methods are different approaches to detecting these signatures. To detect the line in the 2DFFT, we use a Hough transform. To detect the presence of an FRB signature in the 1DFFT approach, we evaluate the total power in the data and compare it to a decision threshold. The paper presents a performance analysis of each detection method.
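The 1DFFT detection statistic can be sketched in a few lines. This is a simplified illustration under our own naming, with the decision threshold left as a free parameter (the paper calibrates it from the noise statistics):

```python
import numpy as np

def fft_power_statistic(tf_plane):
    """Sum the magnitudes of the 1-D FFT of every frequency channel of the
    (t, f) plane; an FRB appears as excess power near zero frequency."""
    spectra = np.abs(np.fft.fft(tf_plane, axis=1))  # FFT along the time axis
    return spectra.sum(axis=0)                      # combined magnitude profile

def detect_frb(tf_plane, threshold):
    """Decide 'FRB present' when the total power exceeds the threshold."""
    return fft_power_statistic(tf_plane).sum() > threshold
```

The 2DFFT method replaces the per-channel FFT with a two-dimensional FFT of the whole plane and searches for the low-frequency line with a Hough transform.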
A new compound operator is proposed for heterogeneous periocular recognition. The new operator outperforms three basic operators and three state-of-the-art compound operators. NIR, SWIR, MWIR, and LWIR spectra at different standoff distances are considered. A metric is introduced to measure image quality and explain its impact on performance. The new operator does not require training and can be applied to various datasets. Cross-spectral matching of active and passive infrared (IR) periocular images to a visible light periocular image gallery is a challenging research problem. This scenario is motivated by a number of surveillance applications such as recognition of subjects at night or in harsh environmental conditions. This problem becomes even more challenging with a varying standoff distance. To address this problem, a new compound operator named GWLH is proposed, which fuses three local descriptors, Histogram of Gradients (HOG), Local Binary Patterns (LBP), and Weber Local Descriptors (WLD), applied to the outputs of Gabor filters. The local operators encode both magnitude and phase information. When applied to periocular regions, GWLH outperforms other compound operators that recently appeared in the literature. During performance evaluation, LBP, Gabor filters, HOG, and a fusion of HOG and LBP establish a baseline for the performance comparison, while other compound operators such as Gabor followed by HOG and LBP, as well as Gabor followed by WLD, LBP, and GLBP, present the state of the art. The active IR band is represented by short-wave infrared (SWIR) and near-infrared (NIR), and passive IR is represented by mid-wave infrared (MWIR) and long-wave infrared (LWIR). In addition to varying the spectrum, we also vary the standoff distance of SWIR and NIR probes. In all but one combination of spectrum and range, GWLH outperforms all the other operators.
A sharpness metric is introduced to measure the quality of heterogeneous periocular images and to emphasize the need for the development of image enhancement approaches for heterogeneous periocular biometrics. Based on the statistics of the sharpness metric, the performance difference between compound and single operators increases with increasing sharpness metric values.
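For reference, the basic 8-neighbor LBP descriptor that GWLH fuses with HOG and WLD can be computed as follows. This is a minimal sketch with our own function name; production implementations typically add rotation-invariant and uniform-pattern variants, and GWLH applies such descriptors to Gabor filter outputs rather than raw pixels.

```python
import numpy as np

def lbp8(img):
    """Basic 8-neighbor Local Binary Pattern codes for the interior pixels
    of a grayscale image: each neighbor >= center contributes one bit."""
    c = img[1:-1, 1:-1]  # centers (interior pixels only)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros(c.shape, dtype=np.int32)
    for bit, (dy, dx) in enumerate(offsets):
        nb = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        code |= (nb >= c).astype(np.int32) << bit
    return code.astype(np.uint8)
```

Histograms of these codes over local patches form the feature vector that is then compared across spectral bands.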
2010 Fourth IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS), 2010
Publications Chair Patrick J. Flynn, University of Notre Dame, USA ... Mohamed Abdel-Mottaleb, University of Miami, USA George Bebis, University of Nevada – Reno, USA Olga Bellon, Universidade Federal do Parana, Brazil Ross Beveridge, Colorado State University, USA Vijyaykumar Bhagavatula, Carnegie-Mellon University, USA Bir Bhanu, University of California – Riverside, USA Wageeh Boles, Queensland University of Technology, Australia Julien Bringer, Morpho, France Mark Burge, MITRE, USA Patrizio Campisi, Universita degli ...
The assessment of iris uniqueness plays a crucial role in analyzing the capabilities and limitati... more The assessment of iris uniqueness plays a crucial role in analyzing the capabilities and limitations of iris recognition systems. Among the various methodologies proposed, Daugman's approach to iris uniqueness stands out as one of the most widely accepted. According to Daugman, uniqueness refers to the iris recognition system's ability to enroll an increasing number of classes while maintaining a near-zero probability of collision between new and enrolled classes. Daugman's approach involves creating distinct IrisCode templates for each iris class within the system and evaluating the sustainable population under a fixed Hamming distance between codewords. In our previous work [23], we utilized Rate-Distortion Theory (as it pertains to the limits of error-correction codes) to establish boundaries for the maximum possible population of iris classes supported by Daugman's IrisCode, given the constraint of a fixed Hamming distance between codewords. Building upon that research, we propose a novel methodology to evaluate the scalability of an iris recognition system, while also measuring iris quality. We achieve this by employing a sphere-packing bound for Gaussian codewords and adopting a approach similar to Daugman's, which utilizes relative entropy as a distance measure between iris classes. To demonstrate the efficacy of our methodology, we illustrate its application on two small datasets of iris images. We determine the sustainable maximum population for each dataset based on the quality of the images. By providing these illustrations, we aim to assist researchers in comprehending the limitations inherent in their recognition systems, depending on the quality of their iris databases.
Maintenance & Repair costs in heavy-duty trucks are an important component of the total cost of o... more Maintenance & Repair costs in heavy-duty trucks are an important component of the total cost of ownership. Due to the very limited availability of real-time data collected from medium-and heavy-duty vehicles using alternative fuels, this topic has not been well studied resulting in a very slow diffusion of alternative fuel vehicles in the market. This study focuses on collecting maintenance data related to diesel and alternative fuels such as natural gas and propane for the school bus, delivery truck, vocational truck, refuse truck, goods movement truck, and transit bus. The novelty of this work lies in identifying the mixed effects in the maintenance data and using a mixed-effect model for developing a single prediction model on clustered longitudinal data. A mixed-effect random forest machine learning model is trained on the maintenance data for estimating the average cost per mile. The model achieved an R 2 of 98.96% with a mean square error of 0.0089 $/mile for training and an R 2 of 94.31% with a mean square error of 0.0312 $/mile for the validation dataset. The prediction model is evaluated on each cluster of data and observed to perform well capturing the variations in each cluster very well. Furthermore, the performance of the mixed-effect random forest model is compared with the XGBoost ensemble model. INDEX TERMS Heavy-duty vehicles, vocational trucks, alternative fuel, diesel, maintenance and repair cost, mixed effect model, mixed effect random forest.
The maintenance costs can represent about 15%–60% of the cost of produced goods depending on the ... more The maintenance costs can represent about 15%–60% of the cost of produced goods depending on the type of goods transported. To comply with stringent emissions regulations, diesel engines are incorporated with complex after-treatment systems that demand increased maintenance. The availability of alternative fuels such as natural gas and propane has fostered the natural gas and propane powertrain systems as well as electrification options for heavy- and medium-duty vehicles. A critical barrier to adopting alternative fuel vehicles has been the lack of knowledge on comparative vehicle maintenance/repair costs with conventional diesel. Moreover, the region of operation, the type of vehicle operation, and seasonal temperature changes also affect the duty cycle which impacts the maintenance and repair costs. This study focuses on estimating the cost-per-mile for heavy-duty vehicles using machine learning models such as random forest, xgboost, neural networks, and a super-learner model. Th...
IEEE Transactions on Biometrics, Behavior, and Identity Science
Iris is an established modality in biometric recognition applications including consumer electron... more Iris is an established modality in biometric recognition applications including consumer electronics, e-commerce, border security, forensics, and de-duplication of identity at a national scale. In light of the expanding usage of biometric recognition, identity clash (when templates from two different people match) is an imperative factor of consideration for a system's deployment. This study explores system capacity estimation by empirically estimating the constrained capacity of an end-to-end iris recognition system (NIR systems with Daugman-based feature extraction) operating at an acceptable error rate i.e. the number of subjects a system can resolve before encountering an error. We study the impact of six system parameters on an iris recognition system's constrained capacity-number of enrolled identities, image quality, template dimension, random feature elimination, filter resolution, and system operating point. In our assessment, we analyzed 13.2 million comparisons from 5158 unique identities for each of 24 different system configurations. This work provides a framework to better understand iris recognition system capacity as a function of biometric system configurations beyond the operating point, for large-scale applications.
Cross-spectral face verification between short-wave infrared (SWIR) and visible light (VIS) face ... more Cross-spectral face verification between short-wave infrared (SWIR) and visible light (VIS) face images poses a challenge, which is motivated by various real-world applications such as surveillance at night time or in harsh environments. This paper proposes a hybrid solution that takes advantage of both traditional feature engineering and modern deep learning techniques to overcome the issue of limited imagery as encountered in the SWIR band. Firstly, the paper revisits the theory of measurement levels. Then, two new operators are introduced which act at the nominal and interval levels of measurement and are named the Nominal Measurement Descriptor (NMD) and the Interval Measurement Descriptor (IMD), respectively. A composite operator Gabor Multiple-Level Measurement (GMLM) is further proposed which fuses multiple levels of measurement. Finally, the fused features of GMLM are passed through a succinct and efficient neural network based on PCA. The network selects informative feature...
2018 IEEE Global Conference on Signal and Information Processing (GlobalSIP), 2018
This paper analyzes the potential of the Spatial Fourier transform (SFT) for detection of a perio... more This paper analyzes the potential of the Spatial Fourier transform (SFT) for detection of a periodic astrophysical signal and for estimation of parameters of the signal. In place of de-dispersing filter bank data for each Dispersion Measure (DM) trial and then integrating over frequency channels to yield a one-dimensional signal, we apply SFT to filter bank data, then detect periodic astrophysical signals and analyze their parameters such as DM and rotational period. This approach allows searching for periodic astrophysical signals in real time. Its complexity is dominated by the complexity of the SFT. The results of our analysis show promise. Using simulated data we demonstrate that it takes about 3 minutes of observation time to detect a pulsar at an S/N value of 8σ. The SFT data also provide information about the rotation of pulsars and lower and upper bounds on their DM value.
Pattern recognition problems are analyzed by applying model-based likelihood approaches, with the major emphasis placed on the asymptotic and numerical analysis of the average probability of recognition error. We consider two types of recognition systems: (i) systems with known models for the observed data; and (ii) systems with some unknown parameters in distributions of the observed data, designed using a limited amount of training data. The analysis for the first problem is motivated by a growing commercial interest in verification and identification systems. Our analysis is based on the notion of physical signature. The signatures are modeled as realizations of a random process with known statistics. The focus of performance analysis for identification systems is placed on determining their capabilities. The test statistic for this problem is a vector of information densities. For identification systems with the number of templates growing exponentially, capacity of the system i...
Iris biometric is one of the most reliable biometrics with respect to performance. However, this reliability is a function of the ideality of the data. One of the most important steps in processing nonideal data is reliable and precise segmentation of the iris pattern from the remaining background. In this paper, a segmentation methodology is proposed that aims at compensating for various nonidealities contained in iris images during segmentation. The virtue of this methodology lies in its capability to reliably segment nonideal imagery that is simultaneously affected by such factors as specular reflection, blur, lighting variation, occlusion, and off-angle imaging. We demonstrate the robustness of our segmentation methodology by evaluating ideal and nonideal data sets, namely, the Chinese Academy of Sciences iris data version 3 interval subdirectory, the iris challenge evaluation data, the West Virginia University (WVU) data, and the WVU off-angle data. Furthermore, we compare our perfo...
2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2017
This paper presents a blind algorithm for the automatic detection of isolated astrophysical pulses. The detection algorithm is applied to spectrograms (also known as "filter bank data" or "the (t,f) plane") and comprises a sequence of three steps: (1) a Radon transform is applied to the spectrogram; (2) a Fourier transform is applied to each projection parametrized by an angle, and the total power in each projection is calculated; and (3) the total power of all projections above 90° is compared to the total power of all projections below 90°, and a decision is made in favor of an astrophysical pulse being present or absent. Once a pulse is detected, its Dispersion Measure (DM) is estimated by fitting an analytically developed expression for a transformed spectrogram containing a pulse, with varying value of DM, to the actual data. The performance of the proposed algorithm is numerically analyzed.
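The three steps above can be sketched as follows, using a small hand-rolled discrete Radon transform on simulated data; the paper's exact implementation is not reproduced, and all sizes, amplitudes, and the noise level are illustrative assumptions. By Parseval's theorem, the total power of a projection's Fourier transform is proportional to the sum of its squared samples, which is used directly in step 2.

```python
import numpy as np

# Simulated (t, f) spectrogram: noise plus a slanted line mimicking a
# dispersed pulse (delay growing with channel index).
rng = np.random.default_rng(1)
n = 64
spec = 0.5 * rng.normal(size=(n, n))
for t in range(n):
    spec[t, t] += 5.0

def radon(img, thetas_deg):
    """Step 1: discrete Radon transform -- line-integral projections."""
    ny, nx = img.shape
    y, x = np.mgrid[0:ny, 0:nx]
    xc, yc = x - nx / 2.0, y - ny / 2.0
    nbins = int(np.hypot(nx, ny)) + 2
    sino = np.zeros((len(thetas_deg), nbins))
    for i, th in enumerate(np.deg2rad(thetas_deg)):
        s = np.round(xc * np.cos(th) + yc * np.sin(th) + nbins / 2.0).astype(int)
        np.add.at(sino[i], s.ravel(), img.ravel())
    return sino

thetas = np.arange(0.0, 180.0)
sino = radon(spec, thetas)

# Step 2: by Parseval's theorem, the total power of each projection's FFT
# is proportional to the sum of its squared samples.
power = (sino ** 2).sum(axis=1)

# Step 3: compare total power above and below 90 degrees.
above = power[thetas > 90].sum()
below = power[thetas < 90].sum()
print("pulse detected:", above > below)
```

The slanted line concentrates into a single projection bin at one angle above 90°, which is what tips the power comparison.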
Journal of Astronomical Telescopes, Instruments, and Systems, 2020
Abstract. We present an algorithm for the detection of candidate astronomical pulses. It is implemented in several steps. First, a spectrogram of a dispersed astronomical pulse is linearized in observing frequency, followed by application of the Radon transform; the result of the transformation is displayed as a two-dimensional function. Next, the function is smoothed using a spatial low-pass filter. Finally, the maximum of the function above the 90-deg angle is compared to the maximum of the standard deviation of the noise below the 90-deg angle, and a decision is made in favor of an astronomical pulse being present or absent. Once a pulse is detected, its dispersion measure (DM) is estimated by means of a basic equation relating the slope of the linearized dispersed pulse to the DM value. The performance of the algorithm is analyzed by applying it to a set of simulated fast radio bursts, experimental data of the Masui pulse, and data of seven rotating radio transients. The detection algorithm demonstrates results comparable to those of the conventional pulse detection algorithm.
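The DM estimation step can be illustrated with the standard cold-plasma dispersion relation, Δt ≈ k_DM · DM · (ν_lo⁻² − ν_hi⁻²) with k_DM ≈ 4.149 ms GHz² pc⁻¹ cm³: once the slope (delay across the band) is measured, the equation is inverted for DM. The frequencies and DM value below are made-up numbers for a round-trip check, not data from the paper.

```python
K_DM = 4.149  # dispersion constant, ms GHz^2 pc^-1 cm^3

def delay_ms(dm, f_lo_ghz, f_hi_ghz):
    """Dispersion delay (ms) of the low band edge relative to the high one."""
    return K_DM * dm * (f_lo_ghz ** -2 - f_hi_ghz ** -2)

def dm_from_delay(dt_ms, f_lo_ghz, f_hi_ghz):
    """Invert the delay equation for the dispersion measure (pc cm^-3)."""
    return dt_ms / (K_DM * (f_lo_ghz ** -2 - f_hi_ghz ** -2))

# Round trip with illustrative values: DM = 100 pc cm^-3, 1.2-1.5 GHz band.
dt = delay_ms(100.0, 1.2, 1.5)
dm = dm_from_delay(dt, 1.2, 1.5)
```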
Shown are three histogram plots of HDs obtained by varying parameters in one of three major parameter groups. The rightmost histogram (marked in the dark color) was obtained when images generated by varying "FIBER" parameters were compared. The histogram in the middle (marked in the gray color) was obtained by comparing images with random "COLLARETTE" parameters. The leftmost histogram (marked in the light color) was generated from images with varying "BASIC" parameters.
2.8 A gallery of synthetic iris images generated using a model-based, anatomy-based approach. Iris 4 is a real iris image borrowed from the CASIA data set.
2.9 Shown are three unwrapped and enhanced iris images. The images are samples from (a) the CASIA data set, (b) the WVU non-ideal iris data set, and (c) a data set of synthetic irises generated using our model-based approach.
2.10 Original images (left column) and the real components of their log-Gabor filtered results (right column).
Matching facial images acquired in different electromagnetic spectral bands remains a challenge. An example of this type of comparison is matching active or passive infrared (IR) images against a gallery of visible face images. When combined with cross-distance, this problem becomes even more challenging due to the deteriorated quality of the IR data. As an example, we consider a scenario where visible light images are acquired at a short standoff distance while IR images are long-range data. To address the difference in image quality due to atmospheric and camera effects, typical degrading factors observed in long-range data, we propose two approaches that allow us to equalize the image quality of visible and IR face images. The first approach involves Gaussian-based smoothing functions applied to images acquired at a short distance (visible light images in the case we analyze). The second approach involves denoising and enhancement applied to low-quality IR face images. A quality measure tool called the Adaptive Sharpness Measure, an improvement of the well-known Tenengrad method, is utilized as guidance for the quality parity process. For the recognition algorithm, a composite operator combining Gabor filters, Local Binary Patterns (LBP), generalized LBP, and the Weber Local Descriptor (WLD) is used. The composite operator encodes both the magnitude and phase responses of the Gabor filters. The combination of LBP and WLD utilizes both the orientation and intensity information of edges. Different IR bands, short-wave infrared (SWIR) and near-infrared (NIR), and different long standoff distances are considered. The experimental results show that in all cases the proposed technique of image quality parity (both approaches) benefits the final recognition performance.
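Since the Adaptive Sharpness Measure builds on the Tenengrad method, a minimal sketch of the classical Tenengrad score (not the adaptive variant itself) is shown below: sharpness is the mean squared Sobel gradient magnitude, optionally restricted to values above a threshold. The test image and threshold are illustrative, and the sketch also shows why Gaussian smoothing serves as the degradation knob for quality parity.

```python
import numpy as np
from scipy import ndimage

def tenengrad(img, thresh=0.0):
    """Classical Tenengrad sharpness: mean squared Sobel gradient magnitude."""
    gx = ndimage.sobel(img.astype(float), axis=1)
    gy = ndimage.sobel(img.astype(float), axis=0)
    g2 = gx ** 2 + gy ** 2
    mask = g2 > thresh
    return g2[mask].mean() if mask.any() else 0.0

# Gaussian smoothing lowers the score -- the degradation used for parity.
rng = np.random.default_rng(2)
img = rng.random((64, 64))          # toy "image" with high-frequency detail
sharp = tenengrad(img)
blurred = tenengrad(ndimage.gaussian_filter(img, 2.0))
print(blurred < sharp)
```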
Matching images acquired in different electromagnetic bands remains a challenging problem. An example of this type of comparison is matching active or passive infrared (IR) images against a gallery of visible face images, known as cross-spectral face recognition. Among many unsolved issues is that of the quality disparity of the heterogeneous images. Images acquired in different spectral bands are of unequal quality due to distinct imaging mechanisms, standoff distances, or imaging environments. To reduce the effect of quality disparity on recognition performance, one can manipulate images either to improve the quality of poor-quality images or to degrade the high-quality images to the level of their heterogeneous counterparts. To estimate the level of discrepancy in the quality of two heterogeneous images, a quality metric such as image sharpness is needed; it provides guidance on how much quality improvement or degradation is appropriate. In this work we consider sharpness as a relative measure of heterogeneous image quality. We propose a generalized definition of sharpness by first achieving image quality parity and then finding and building a relationship between the image quality of two heterogeneous images; the new sharpness metric is therefore named heterogeneous sharpness. Image quality parity is achieved by experimentally finding the optimal cross-spectral face recognition performance while the quality of the heterogeneous images is varied using a Gaussian smoothing function with different standard deviations. The relationship is established using two models: a regression model and a neural network.
To train, test, and validate the models, we use composite operators developed in our lab to extract features from heterogeneous face images and use the sharpness metric to evaluate the face image quality within each band. Images from three different spectral bands, visible light, near-infrared, and short-wave infrared, are considered in this work. Both the error of the regression model and the validation error of the neural network are analyzed.
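A hedged sketch of the regression variant: blurred copies of synthetic images stand in for the second spectral band, and a simple Sobel-based proxy stands in for the sharpness metric (both are assumptions for the example, not the authors' operators or data). A linear fit then relates the sharpness of one "band" to the other.

```python
import numpy as np
from scipy import ndimage

def sharpness(img):
    """Toy sharpness proxy: mean squared Sobel gradient magnitude."""
    gx = ndimage.sobel(img, axis=1)
    gy = ndimage.sobel(img, axis=0)
    return (gx ** 2 + gy ** 2).mean()

rng = np.random.default_rng(3)
s_a, s_b = [], []
for _ in range(20):
    # Synthetic "band A" images with varying intrinsic smoothness.
    img = ndimage.gaussian_filter(rng.random((64, 64)), rng.uniform(0.5, 2.0))
    s_a.append(sharpness(img))
    # "Band B" surrogate: an extra fixed Gaussian blur.
    s_b.append(sharpness(ndimage.gaussian_filter(img, 1.5)))

# Linear model relating heterogeneous sharpness values across the two bands.
slope, intercept = np.polyfit(s_a, s_b, 1)
```

A neural network could replace `np.polyfit` here with the same inputs and target.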
Abstract. Cross-spectral image matching is a challenging research problem motivated by various applications, including surveillance, security, and identity management in general. An example of this problem is cross-spectral matching of active infrared (IR) or thermal IR face images against a dataset of visible light images. A summary of recent developments in the field of cross-spectral face recognition by the authors is presented. In particular, it describes the original form and two variants of a local operator named the composite multilobe descriptor (CMLD) for facial feature extraction with the purpose of cross-spectral matching of near-IR, short-wave IR, mid-wave IR, and long-wave IR to a gallery of visible light images. The experiments demonstrate that the variants of CMLD outperform the original CMLD and other recently developed composite operators used for comparison. In addition to different IR spectra, various standoff distances from close-up (1.5 m) to intermediate (50 m) and long (106 m) are also investigated. The performance of CMLD I to III is evaluated for each of the three distances. The newly developed operators, CMLD I to III, are further utilized to conduct a study on cross-spectral partial face recognition, where different facial regions are compared in terms of the amount of useful information they contain for conducting cross-spectral face recognition. The experimental results show that among the three facial regions considered in the experiments, the eye region is the most informative for all IR spectra at all standoff distances.
2016 United States National Committee of URSI National Radio Science Meeting (USNC-URSI NRSM), 2016
This paper presents two methods for the blind detection of Fast Radio Bursts (FRBs). The methods are applied to filter bank data (also known as a spectrogram or (t, f) plane) and are designed to detect FRB signatures in the one-dimensional Fourier transforms (1DFFT) and in the spatial Fourier transforms (2DFFT) of the (t, f) data. The signature of an FRB in the 2DFFT plane is a line with a slope strictly between zero and 90° in the range of low spatial frequencies. In the sum of magnitudes of the 1DFFT of each frequency channel in the (t, f) plane, the signature of an FRB appears as a relatively strong, approximately Gaussian bell-shaped pulse centered around zero frequency. The detection methods are different approaches to detecting these signatures. To detect the line in the 2DFFT, we use a Hough transform. To detect the presence of an FRB signature in the 1DFFT approach, we evaluate the total power in the data and compare it to a decision threshold. The paper presents a performance analysis of each detection method.
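A minimal sketch of the 1DFFT method under simple assumptions: each frequency channel holds noise plus, in the signal case, a short pulse whose arrival time drifts with channel, mimicking dispersion. The drift only changes the phase of each channel's FFT, so the summed magnitudes still concentrate at low frequencies. All sizes and the 1.5× decision threshold are illustrative choices for the example, not the paper's calibrated values.

```python
import numpy as np

rng = np.random.default_rng(4)
n_chan, n_time = 64, 512

def summed_fft_mag(data):
    """Sum of |FFT| over channels, FFT taken along the time axis."""
    return np.abs(np.fft.rfft(data, axis=1)).sum(axis=0)

noise = rng.normal(size=(n_chan, n_time))     # noise-only observation
frb = noise.copy()
for c in range(n_chan):                       # dispersed pulse: delay grows
    t0 = 100 + 2 * c                          # linearly with channel index
    frb[c, t0:t0 + 8] += 4.0

# Total power in the low spatial frequencies (DC excluded) vs. a threshold
# calibrated here on the noise-only realization.
low = slice(1, 20)
stat_noise = summed_fft_mag(noise)[low].sum()
stat_frb = summed_fft_mag(frb)[low].sum()
print("FRB detected:", stat_frb > 1.5 * stat_noise)
```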
A new compound operator is proposed for heterogeneous periocular recognition. The new operator outperforms three basic operators and three state-of-the-art compound operators. NIR, SWIR, MWIR, and LWIR spectra at different standoff distances are considered. A metric is introduced to measure image quality and explain its impact on performance. The new operator does not require training and can be applied to various datasets. Cross-spectral matching of active and passive infrared (IR) periocular images to a visible light periocular image gallery is a challenging research problem. This scenario is motivated by a number of surveillance applications such as recognition of subjects at night or in harsh environmental conditions. The problem becomes even more challenging with a varying standoff distance. To address it, a new compound operator named GWLH is proposed, which fuses three local descriptors applied to the outputs of Gabor filters: Histogram of Gradients (HOG), Local Binary Patterns (LBP), and Weber Local Descriptors (WLD). The local operators encode both magnitude and phase information. When applied to periocular regions, GWLH outperforms other compound operators that have recently appeared in the literature. During performance evaluation, LBP, Gabor filters, HOG, and a fusion of HOG and LBP establish a baseline for the performance comparison, while other compound operators, such as Gabor followed by HOG and LBP as well as Gabor followed by WLD, LBP, and GLBP, represent the state of the art. The active IR band is represented by short-wave infrared (SWIR) and near-infrared (NIR), and the passive IR band by mid-wave infrared (MWIR) and long-wave infrared (LWIR). In addition to varying the spectrum, we also vary the standoff distance of the SWIR and NIR probes. In all but one combination of spectrum and range, GWLH outperforms all the other operators.
A sharpness metric is introduced to measure the quality of heterogeneous periocular images and to emphasize the need for image enhancement approaches in heterogeneous periocular biometrics. Based on the statistics of the sharpness metric, the performance difference between compound and single operators increases proportionally with increasing sharpness metric values.
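GWLH itself is not reproduced here; as a minimal illustration of one of its three fused descriptors, below is a plain-NumPy 8-neighbour LBP whose histogram serves as a toy feature vector. The neighbour ordering and the toy image are arbitrary choices for the example and make no claim about the authors' implementation details.

```python
import numpy as np

def lbp8(img):
    """8-neighbour local binary pattern codes for the interior pixels."""
    c = img[1:-1, 1:-1]                       # center pixels
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
              (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(shifts):
        # Neighbour plane shifted by (dy, dx) relative to the center.
        n = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        code |= (n >= c).astype(np.uint8) << bit
    return code

img = np.arange(25, dtype=float).reshape(5, 5)        # toy "image"
hist = np.bincount(lbp8(img).ravel(), minlength=256)  # LBP histogram feature
```

For a monotonically increasing toy image, every interior pixel gets the same code, since the same four neighbours always exceed the center.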
2010 Fourth IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS), 2010
Publications Chair Patrick J. Flynn, University of Notre Dame, USA ... Mohamed Abdel-Mottaleb, University of Miami, USA George Bebis, University of Nevada – Reno, USA Olga Bellon, Universidade Federal do Parana, Brazil Ross Beveridge, Colorado State University, USA Vijyaykumar Bhagavatula, Carnegie-Mellon University, USA Bir Bhanu, University of California – Riverside, USA Wageeh Boles, Queensland University of Technology, Australia Julien Bringer, Morpho, France Mark Burge, MITRE, USA Patrizio Campisi, Universita degli ...
Papers by Natalia Schmid