Papers by Peyman Milanfar
We present a novel human action recognition method based on space-time locally adaptive regression kernels and the matrix cosine similarity measure. The proposed method operates using a single example (e.g., a short video clip) of an action of interest to find similar matches. It does not require prior knowledge (learning) about the actions being sought, and it does not require foreground/background segmentation or any motion estimation or tracking. Our method is based on the computation of so-called local steering kernels as space-time descriptors from a query video, which measure the likeness of a voxel to its surroundings. Salient features are extracted from these descriptors and compared against analogous features from the target video. This comparison is done using a matrix generalization of the cosine similarity measure. The algorithm yields a scalar resemblance volume in which each voxel indicates the likelihood of similarity between the query video and the cubes in the target ...
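As a rough illustration of the matrix cosine similarity mentioned above, the sketch below computes the Frobenius-inner-product form of the measure between two feature matrices; the function name, array shapes, and toy data are assumptions for illustration, not the paper's code.

```python
import numpy as np

def matrix_cosine_similarity(F_query, F_target):
    """Matrix cosine similarity: the Frobenius inner product of two feature
    matrices, normalized by their Frobenius norms. Illustrative helper;
    names and shapes are assumptions, not the paper's implementation."""
    num = np.sum(F_query * F_target)                           # Frobenius inner product <F_q, F_t>_F
    den = np.linalg.norm(F_query) * np.linalg.norm(F_target)   # ||F_q||_F * ||F_t||_F
    return num / den if den > 0 else 0.0

# Toy usage: 64-dimensional descriptors at 9 space-time patch positions.
rng = np.random.default_rng(0)
F_q = rng.standard_normal((64, 9))
F_t = F_q + 0.1 * rng.standard_normal((64, 9))   # a "similar" target cube
print(matrix_cosine_similarity(F_q, F_t))         # close to 1 for similar cubes
```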
Frontiers in Optics 2009/Laser Science XXV/Fall 2009 OSA Optics & Photonics Technical Digest, 2009
ABSTRACT I present a nonparametric framework for locally-adaptive signal processing and analysis. Without making strong assumptions about noise/signal models, the framework is applicable to many problems including denoising, upscaling, and object detection in images and video.
Computational Imaging IV, 2006
ABSTRACT The statistics of natural images play an important role in many image processing tasks. In particular, statistical assumptions about differences between neighboring pixel values are used extensively in the form of prior information for many diverse applications. The most common assumption is that these pixel difference values can be described by either a Laplace or a Generalized Gaussian distribution. The statistical validity of these two assumptions is investigated formally in this paper by means of Chi-squared goodness-of-fit tests. The Laplace and Generalized Gaussian distributions are seen to deviate from real images, with the main source of error being the large number of zero and near-zero neighboring pixel difference values. These values correspond to the relatively uniform areas of the image. A mixture distribution is proposed to retain the edge-modeling ability of the Laplace or Generalized Gaussian distribution, and to improve the modeling of the effects introduced by smooth image regions. The Chi-squared tests of fit indicate that the mixture distribution offers a significant improvement in fit.
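A minimal sketch of the kind of test described above, using horizontal neighboring-pixel differences and a maximum-likelihood Laplace fit; the binning, the ddof choice, and the helper name are illustrative assumptions, not the paper's exact protocol.

```python
import numpy as np
from scipy import stats

def laplace_chi2_gof(image, n_bins=51):
    """Chi-squared goodness-of-fit of a Laplace density to horizontal
    neighboring-pixel differences (rough illustration only)."""
    diffs = np.diff(image.astype(float), axis=1).ravel()   # neighboring pixel differences
    loc, scale = stats.laplace.fit(diffs)                   # ML fit of the Laplace model
    counts, edges = np.histogram(diffs, bins=n_bins)
    cdf = stats.laplace.cdf(edges, loc=loc, scale=scale)
    expected = np.diff(cdf) * diffs.size                    # expected counts under the fit
    expected *= counts.sum() / expected.sum()                # match totals, as chisquare requires
    chi2, p = stats.chisquare(counts, expected, ddof=2)      # 2 fitted parameters
    return chi2, p
```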
IEEE Signal Processing Magazine, 2000
ABSTRACT In this article, the author presents a practical and accessible framework to understand some of the basic underpinnings of these methods, with the intention of leading the reader to a broad understanding of how they interrelate. The author also illustrates connections between these techniques and more classical (empirical) Bayesian approaches. The proposed framework is used to arrive at new insights and methods, both practical and theoretical. In particular, several novel optimality properties of algorithms in wide use, such as block-matching and three-dimensional (3-D) filtering (BM3D), and methods for their iterative improvement (or the nonexistence thereof), are discussed. A general approach is laid out to enable the performance analysis and subsequent improvement of many existing filtering algorithms. While much of the material discussed is applicable to the wider class of linear degradation models beyond noise (e.g., blur), to keep matters focused we consider the problem of denoising here.
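To make the filtering viewpoint concrete, here is a minimal 1-D sketch in which a data-dependent kernel is normalized into a row-stochastic matrix W and the denoised signal is W y; the Gaussian kernel, the bandwidths, and the function name are illustrative assumptions, not a specific algorithm from the article.

```python
import numpy as np

def kernel_filter_1d(y, h_x=2.0, h_y=0.1):
    """Minimal data-dependent (bilateral-style) filter written as z = W y,
    the row-stochastic matrix form often used to compare patch/kernel-based
    denoisers. 1-D for brevity; bandwidths assume intensities roughly in [0, 1]."""
    n = y.size
    idx = np.arange(n)
    spatial = (idx[:, None] - idx[None, :]) ** 2 / h_x**2     # spatial distances
    radiometric = (y[:, None] - y[None, :]) ** 2 / h_y**2     # intensity distances
    K = np.exp(-(spatial + radiometric))                      # data-dependent kernel
    W = K / K.sum(axis=1, keepdims=True)                      # normalize rows -> row-stochastic W
    return W @ y                                              # denoised estimate z = W y
```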
... specific areas: 1. We exploit recent advances in the physical design of fast optical systems which enable active imaging and ranging with ballistic light. In this modality, fast bursts of optical energy are transmitted into a ...
A high-resolution ground penetrating radar system was designed to help define the optimal radar parameters needed for the efficient standoff detection of buried and surface-laid antitank mines. The design requirements call for a forward-looking GPR capable of detecting antitank mines in a 5 to 8 meter wide swath, 7 to 60 meters in front of a mobile platform. The ...
Theoretical and practical limitations usually constrain the achievable resolution of any imaging device. Super-Resolution (SR) methods have been developed over the years to go beyond this limit by acquiring and fusing several low-resolution (LR) images of the same scene to produce a high-resolution (HR) image. The early works on SR, although occasionally mathematically optimal for particular models of data and noise, produced poor results when applied to real images. In this paper, we discuss two of the main issues related to designing a practical SR system, namely reconstruction accuracy and computational efficiency. Reconstruction accuracy refers to the problem of designing a robust SR method applicable to images from different imaging systems. We study a general framework for optimal reconstruction of images from grayscale, color, or color filtered (CFA) cameras. The performance of our proposed method is boosted by using powerful priors and is robust to both measurement (e.g. CCD re...
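As a toy illustration of the fusion step that SR methods build on, the sketch below scatters registered low-resolution frames onto a finer grid and takes a pixel-wise median; the function name, the integer-shift assumption, and the omission of deblurring and priors are simplifications, not the paper's full robust, color-aware pipeline.

```python
import numpy as np

def shift_and_add(lr_frames, shifts, r):
    """Toy shift-and-add fusion: place each low-resolution frame onto an
    r-times finer grid at its (already-estimated) shift, expressed here as an
    integer offset 0 <= dy, dx < r on the HR grid, then take the pixel-wise
    median. Sketch of the data-fusion step only; deblurring, demosaicing,
    and regularization priors are omitted."""
    h, w = lr_frames[0].shape
    stack = np.full((len(lr_frames), h * r, w * r), np.nan)
    for k, (frame, (dy, dx)) in enumerate(zip(lr_frames, shifts)):
        stack[k, dy::r, dx::r] = frame        # scatter LR samples onto the HR grid
    hr = np.nanmedian(stack, axis=0)           # robust (median) fusion of overlaps
    return hr                                  # NaNs remain where no frame contributed
```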
Proceedings 2000 International Conference on Image Processing (Cat. No.00CH37101), 2000
AN EFFICIENT WAVELET-BASED ALGORITHM FOR IMAGE SUPERRESOLUTION. Nhat Nguyen, Scientific Computing and Computational Mathematics Program, Gates Bldg. 2B, Stanford, CA 94305. [email protected] ...
This paper discusses the problem of recovering a planar polygon from its measured complex moments. These moments correspond to an indicator function defined over the polygon's support. Previous work on this problem gave necessary and sufficient conditions for such a successful recovery process and focused mainly on the case of exact measurements being given. In this paper, we extend these results ...
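For orientation, the measured quantities can be written as the complex moments of the polygon's indicator function; the notation below is the standard definition, stated here as an assumption about the setup rather than quoted from the paper.

```latex
% z = x + iy, and P denotes the polygon's support.
c_k \;=\; \iint_{P} z^{k}\, \mathrm{d}x\,\mathrm{d}y, \qquad k = 0, 1, 2, \ldots
```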
IEEE Transactions on Image Processing, 1996
Abstract: Super-resolution reconstruction produces one or a set of high-resolution images from a set of low-resolution images. In the last two decades, a variety of super-resolution methods have been proposed. These methods are usually very sensitive to their assumed model of data and noise, which limits their utility. This paper reviews some of these methods and addresses their shortcomings. We ...
Scientific Reports, 2015
The increasing interest in nanoscience in many research fields such as physics, chemistry, and biology, including the environmental fate of the produced nano-objects, requires instrumental improvements to address the sub-micrometric analysis challenges. The originality of our approach is to use both the super-resolution concept and the multivariate curve resolution (MCR-ALS) algorithm in confocal Raman imaging to surmount its instrumental limits and to characterize chemical components of atmospheric aerosols at the level of individual particles. We demonstrate the possibility to go beyond the diffraction limit with this algorithmic approach. Indeed, the spatial resolution is improved by 65% to achieve 200 nm for the considered far-field spectrophotometer. A multivariate curve resolution method is then coupled with super-resolution in order to explore the heterogeneous structure of submicron particles and to describe physical and chemical processes that may occur in the atmosphere. The proposed methodology provides new tools for sub-micron characterization of heterogeneous samples using a far-field (i.e. conventional) Raman imaging spectrometer.
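For readers unfamiliar with MCR-ALS, here is a bare-bones alternating-least-squares factorization of a spectra matrix into concentration profiles and component spectra; it is a generic illustration under simple non-negativity clipping, not the authors' processing chain (no closure constraints and no super-resolution step).

```python
import numpy as np

def mcr_als(D, n_components, n_iter=100):
    """Bare-bones MCR-ALS: factor a spectra matrix D (pixels x wavenumbers)
    into concentrations C and component spectra S with D ~= C @ S.T, by
    alternating least squares with a crude non-negativity clip."""
    rng = np.random.default_rng(0)
    S = np.abs(rng.standard_normal((D.shape[1], n_components)))            # initial spectra guess
    for _ in range(n_iter):
        C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0, None)     # update concentrations
        S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0, None)       # update spectra
    return C, S
```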
2009 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, 2009
We present a novel bottom-up saliency detection algorithm. Our method computes so-called local regression kernels (i.e., local features) from the given image, which measure the likeness of a pixel to its surroundings. Visual saliency is then computed using the said "self-resemblance" measure. The framework results in a saliency map where each pixel indicates the statistical likelihood of saliency of a feature matrix given its surrounding feature matrices. As a similarity measure, matrix cosine similarity (a generalization of cosine similarity) is employed. State-of-the-art performance is demonstrated on commonly used human eye fixation data and some psychological patterns.
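One simple way to turn the self-resemblance idea into numbers is sketched below: each pixel's saliency is taken to be inversely proportional to how similar its local descriptor is to the descriptors in a surrounding window. The window size, the exponential weighting, sigma, and the function name are illustrative assumptions, not necessarily the weighting used in the paper.

```python
import numpy as np

def self_resemblance_saliency(features, sigma=0.07):
    """Toy self-resemblance map: `features` is an (H, W, d) array of local
    descriptors (one per pixel). Low similarity to the surround -> salient."""
    H, W, d = features.shape
    F = features / (np.linalg.norm(features, axis=2, keepdims=True) + 1e-12)
    sal = np.zeros((H, W))
    r = 3                                             # half-width of the surround window
    for y in range(H):
        for x in range(W):
            y0, y1 = max(0, y - r), min(H, y + r + 1)
            x0, x1 = max(0, x - r), min(W, x + r + 1)
            sims = np.tensordot(F[y0:y1, x0:x1], F[y, x], axes=([2], [0]))  # cosine similarities
            sal[y, x] = 1.0 / np.sum(np.exp((sims - 1.0) / sigma**2))       # low self-resemblance -> salient
    return sal / sal.max()
```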
Lecture Notes in Computer Science, 2013
2009 IEEE International Symposium on Biomedical Imaging: From Nano to Macro, 2009
We present a novel approach to change detection between two brain MRI scans (reference and target). The proposed method uses a single modality to find subtle changes, and does not require prior knowledge (learning) of the type of changes to be sought. The method is based on the computation of a local kernel from the reference image, which measures the likeness of a pixel to its surroundings. This kernel is then used as a feature and compared against analogous features from the target image. This comparison is made using cosine similarity. The overall algorithm yields a scalar dissimilarity map (DM), indicating the local statistical likelihood of dissimilarity between the reference and target images. DM values exceeding a threshold then identify meaningful and relevant changes. The proposed method is robust to various challenging conditions including unequal signal strength.
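A minimal sketch of the final two steps described above, forming a per-pixel dissimilarity map from cosine similarity and thresholding it, assuming co-registered scans and precomputed local descriptors; the descriptor choice and the threshold value are placeholders, not the paper's calibrated settings.

```python
import numpy as np

def dissimilarity_map(feats_ref, feats_tgt, threshold=0.3):
    """Toy dissimilarity map for two co-registered scans: `feats_ref` and
    `feats_tgt` are (H, W, d) arrays of local descriptors. DM = 1 - cosine
    similarity per pixel; values above `threshold` are flagged as changes."""
    num = np.sum(feats_ref * feats_tgt, axis=2)
    den = np.linalg.norm(feats_ref, axis=2) * np.linalg.norm(feats_tgt, axis=2) + 1e-12
    dm = 1.0 - num / den                        # 0 = identical local structure
    change_mask = dm > threshold                # binary map of flagged changes
    return dm, change_mask
```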
2011 IEEE International Conference on Computer Vision Workshops (ICCV Workshops), 2011
2014 IEEE Conference on Computer Vision and Pattern Recognition Workshops, 2014
Pattern Recognition Letters, 1996
On the Hough transform of a polygon. Peyman Milanfar, SRI International, 333 Ravenswood Ave., Menlo Park, CA 94025, USA. Pattern Recognition Letters 17 (1996) 209-210. Received 5 October 1995.
2010 IEEE International Conference on Acoustics, Speech and Signal Processing, 2010
[Figure caption fragment] Top: plots of PSNR, SSIM, and Q metric versus iteration number in ISKR denoising. Middle: a noisy image (std = 20), boundary map, and selected anisotropic patches (red area). Bottom: ...