Papers by Charles Kervrann
We study an image segmentation energy whose minimizer can be determined. The approach estimates the unknown number of objects and draws object boundaries by selecting the "best" level lines computed from the level sets of the image. As a consequence, no energy minimization method is necessary, yielding a fast and non-iterative segmentation algorithm. Finally, anisotropic diffusion is used to smooth level lines in noisy images.
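A minimal sketch of the level-line selection idea, assuming SciPy and scikit-image; the contrast score (mean gradient magnitude sampled along each closed contour) is an illustrative stand-in for the paper's selection criterion, not the authors' exact energy:

```python
import numpy as np
from scipy import ndimage
from skimage import measure

def best_level_lines(image, n_levels=32, top_k=5):
    """Select closed level lines with the highest mean gradient contrast.

    Level lines are extracted from quantized level sets and scored by
    the mean gradient magnitude along the contour (illustrative score).
    """
    gmag = ndimage.gaussian_gradient_magnitude(image.astype(float), sigma=1.0)
    levels = np.linspace(image.min(), image.max(), n_levels + 2)[1:-1]
    scored = []
    for level in levels:
        for contour in measure.find_contours(image, level):
            closed = np.allclose(contour[0], contour[-1])
            if not closed or len(contour) < 20:
                continue
            # Sample the gradient magnitude along the (row, col) polyline.
            contrast = ndimage.map_coordinates(gmag, contour.T, order=1).mean()
            scored.append((contrast, level, contour))
    scored.sort(key=lambda t: t[0], reverse=True)
    return scored[:top_k]  # (score, level, polyline) triples
```

No iterative energy minimization is involved: the candidate lines are enumerated once and ranked, which is what makes this family of methods fast.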
Change detection between two images is challenging and needed in a wide variety of imaging applications. Several approaches have already been developed, especially methods based on the difference image. In this paper, we propose an original patch-based Markov modeling framework to detect spatial irregularities in the difference image with low false alarm rates. Experimental results show that the proposed approach performs well for change detection, especially for images with low signal-to-noise ratios.
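A toy version of the detection step, replacing the patch-based Markov model with a simple normalized patch-energy test on the difference image; the robust noise estimate and the threshold rule are illustrative assumptions:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def change_map(img1, img2, patch=7, alpha=3.0):
    """Flag patches of the difference image whose energy is improbably
    high under the background noise model (stand-in for the paper's
    patch-based Markov modeling)."""
    d = img2.astype(float) - img1.astype(float)
    patches = sliding_window_view(d, (patch, patch)).reshape(-1, patch * patch)
    # Robust noise scale from the median absolute deviation of the difference.
    sigma = 1.4826 * np.median(np.abs(d - np.median(d)))
    # Patch energy normalized by the expected noise energy (mean 1 under noise).
    score = (patches ** 2).sum(axis=1) / (sigma ** 2 * patch * patch)
    h, w = d.shape
    smap = score.reshape(h - patch + 1, w - patch + 1)
    # Chi-square-like rule: flag scores more than alpha std above the mean.
    return smap > 1.0 + alpha * np.sqrt(2.0 / (patch * patch))
```

Raising `alpha` lowers the false alarm rate at the cost of missing weak changes, which is the trade-off the paper's modeling addresses more carefully.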
This work tackles the issue of noise removal from images, focusing on the well-known DCT image denoising algorithm. The latter, stemming from signal processing, has been well studied over the years. Though very simple, it is still used in crucial parts of state-of-the-art "traditional" denoising algorithms such as BM3D. In recent years, however, deep convolutional neural networks (CNNs) have outperformed their traditional counterparts, making signal processing methods less attractive. In this paper, we demonstrate that a DCT denoiser can be seen as a shallow CNN, and that its original linear transform can thereby be tuned through gradient descent in a supervised manner, considerably improving its performance. This gives birth to a fully interpretable CNN called DCT2net. To deal with remaining artifacts induced by DCT2net, an original hybrid solution between DCT and DCT2net is proposed, combining the best that these two methods can offer; DCT2net is selected to process non-stationary image ...
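For reference, the fixed-transform baseline that DCT2net improves upon is the classic sliding-window DCT denoiser, sketched below with SciPy. Hard thresholding at 3σ is the usual heuristic; conceptually, DCT2net replaces the fixed `dctn`/`idctn` pair with a linear transform learned by gradient descent:

```python
import numpy as np
from scipy.fft import dctn, idctn

def dct_denoise(noisy, sigma, patch=8, thr_mult=3.0):
    """Sliding-window DCT denoising: hard-threshold the 2D DCT of each
    patch and average the overlapping reconstructions."""
    h, w = noisy.shape
    out = np.zeros((h, w), dtype=float)
    weight = np.zeros((h, w), dtype=float)
    thr = thr_mult * sigma
    for i in range(h - patch + 1):
        for j in range(w - patch + 1):
            block = noisy[i:i + patch, j:j + patch].astype(float)
            coef = dctn(block, norm='ortho')
            coef[np.abs(coef) < thr] = 0.0          # hard thresholding
            out[i:i + patch, j:j + patch] += idctn(coef, norm='ortho')
            weight[i:i + patch, j:j + patch] += 1.0
    return out / weight
```

Because every operation here is a linear transform followed by a pointwise nonlinearity and an averaging step, the whole pipeline maps directly onto a shallow CNN, which is the observation the paper builds on.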
A novel adaptive smoothing approach is proposed for noise removal and discontinuity preservation. The method is based on a locally constant modeling of the image, with an adaptive choice, around each pixel, of a window in which the applied model fits the data well. The filtering technique associates with each pixel the weighted sum of the data points within the window. We describe a statistical method for choosing the optimal window size, with an adaptive choice of weights for every pair of pixels in the window. The proposed technique is data-driven and fully adaptive. Simulation results show that our algorithm yields promising smoothing results on a variety of real images.
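A simplified sketch of the adaptive-window principle, using a Lepski-style stopping rule in place of the paper's statistical weight selection; `halfwidths` and `lam` are illustrative parameters:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def adaptive_mean(image, sigma, halfwidths=(1, 2, 4, 8), lam=2.0):
    """Pointwise adaptive smoothing: grow the window around each pixel
    as long as the new local mean stays inside a confidence band around
    the previous estimate, then freeze it."""
    img = image.astype(float)
    est = img.copy()
    frozen = np.zeros(img.shape, dtype=bool)
    for h in halfwidths:
        size = 2 * h + 1
        mean = uniform_filter(img, size)
        band = lam * sigma / size            # std of a size-by-size mean
        # Stop growing the window where the estimate jumps too far,
        # i.e. where the locally constant model no longer fits.
        frozen |= np.abs(mean - est) > band
        est = np.where(frozen, est, mean)
    return est
```

Near a discontinuity the confidence test fails early and a small window is kept, while flat regions receive the largest window, which is what preserves edges while removing noise.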
Nature Methods
In the version of this Article initially published, there was an error in Fig. 2b. The image labeled "Segmentation target" was a duplicate of Fig. 2a; the image has been replaced with the correct version. In the Fig. 4 caption for panels "b,c, Score maps…," the text "(25 Å)" has been removed from the end of the sentence. For the final table in the online Methods, under "Evaluation," the data are unchanged but have been reorganized for clarity. Finally, the two callouts to "Fig. 4" in the Extended Data Fig. 5 caption should instead have referred to "Extended Data Fig. 4" and have now been corrected. The changes have been made to the online version of the article.
BIO-PROTOCOL
The α-β tubulin heterodimer undergoes subtle conformational changes during microtubule assembly. These can be modulated by external factors, whose effects on microtubule structure can be characterized on 2D views obtained by cryo-electron microscopy. Analysis of microtubule images is facilitated if they are straight enough to interpret and filter their image Fourier transform, which provides useful information concerning the arrangement of tubulin molecules inside the microtubule lattice. Here, we describe the use of the TubuleJ software to straighten microtubules and determine their lattice parameters. Basic 3D reconstructions can be performed to evaluate the relevance of these parameters. This approach can be used to analyze the effects of nucleotide analogues, drugs or MAPs on microtubule structure, or to select microtubule images prior to high-resolution 3D reconstructions.
2017 IEEE International Conference on Computer Vision Workshops (ICCVW), Oct 1, 2017
Local and global approaches can be identified as the two main classes of optical flow estimation methods. In this paper, we propose a framework that combines the advantages of these two principles, namely the robustness to noise of the local approach and the discontinuity preservation of the global approach. This is particularly crucial in biological imaging, where the noise produced by microscopes is one of the main issues for optical flow estimation. The idea is to spatially adapt the local support of the parametric constraint in the combined local-global model [6]. To this end, we jointly estimate the motion field and the parameters of the spatial support. We apply our approach to the case of Gaussian filtering, and we derive efficient minimization schemes for the usual data terms. The estimation of a spatially varying standard deviation map prevents the smoothing of motion discontinuities, while ensuring robustness to noise. We validate our method for a standard model and demonstrate how a baseline approach with a pixel-wise data term can be improved when integrated in our framework. The method is evaluated on the Middlebury benchmark with ground truth and on real fluorescence microscopy data.
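A crude stand-in for the idea of a spatially varying support: compute Lucas-Kanade flows for several Gaussian supports and select, per pixel, the σ whose flow best explains the brightness-constancy residual. The paper estimates the σ map jointly within the combined local-global energy, which this sketch does not do:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def lk_flow(I1, I2, sigma):
    """Lucas-Kanade flow with a Gaussian-weighted local support."""
    I1 = np.asarray(I1, float); I2 = np.asarray(I2, float)
    Ix = gaussian_filter(I1, 1.0, order=(0, 1))   # d/dx (columns)
    Iy = gaussian_filter(I1, 1.0, order=(1, 0))   # d/dy (rows)
    It = I2 - I1
    g = lambda a: gaussian_filter(a, sigma)       # local support
    A11, A12, A22 = g(Ix * Ix), g(Ix * Iy), g(Iy * Iy)
    b1, b2 = g(Ix * It), g(Iy * It)
    det = A11 * A22 - A12 ** 2 + 1e-9
    u = -(A22 * b1 - A12 * b2) / det              # solve 2x2 normal equations
    v = -(A11 * b2 - A12 * b1) / det
    return u, v

def adaptive_sigma_flow(I1, I2, sigmas=(1.0, 2.0, 4.0)):
    """Per-pixel selection of the support size by smallest local
    brightness-constancy residual (illustrative selection rule)."""
    I1 = np.asarray(I1, float); I2 = np.asarray(I2, float)
    Ix = gaussian_filter(I1, 1.0, order=(0, 1))
    Iy = gaussian_filter(I1, 1.0, order=(1, 0))
    best_u = best_v = best_res = None
    for s in sigmas:
        u, v = lk_flow(I1, I2, s)
        res = gaussian_filter(np.abs(Ix * u + Iy * v + (I2 - I1)), s)
        if best_res is None:
            best_u, best_v, best_res = u, v, res
        else:
            mask = res < best_res
            best_u = np.where(mask, u, best_u)
            best_v = np.where(mask, v, best_v)
            best_res = np.minimum(res, best_res)
    return best_u, best_v
```

Small supports win near motion discontinuities, large supports win in noisy homogeneous regions, mirroring the role of the spatially varying standard deviation map in the paper.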
IEEE Transactions on Image Processing
One of the major challenges in multiple particle tracking is the capture of extremely heterogeneous movements of objects in crowded scenes. The presence of numerous assignment candidates in the expected range of particle motion makes the tracking ambiguous and induces false positives. Lowering the ambiguity by reducing the search range, on the other hand, is not an option, as this would increase the rate of false negatives.
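The search-range trade-off can be made concrete with a gated Hungarian assignment between consecutive frames; this is a generic linking baseline, not the method of the paper:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def link_frames(pts_prev, pts_next, search_range):
    """Frame-to-frame linking with a gated Hungarian assignment.

    pts_prev, pts_next: (N, 2) and (M, 2) particle coordinates.
    A large search_range admits many ambiguous candidates (false
    positives); a small one misses fast particles (false negatives).
    """
    cost = np.linalg.norm(pts_prev[:, None, :] - pts_next[None, :, :], axis=2)
    big = 1e6
    cost = np.where(cost <= search_range, cost, big)   # gating
    rows, cols = linear_sum_assignment(cost)
    # Keep only links that fall inside the gate.
    return [(r, c) for r, c in zip(rows, cols) if cost[r, c] < big]
```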
European Microscopy Congress 2016: Proceedings, 2016
In this study, we have addressed two important issues in cryo-electron tomography images: the low signal-to-noise ratio and the presence of a missing wedge (MW) of information in the spectral domain. Indeed, according to the Fourier slice theorem, limited-angle tomography results in an incomplete sampling of the Fourier domain. Therefore, the Fourier domain is separated into two regions: the known spectrum (KS) and the unknown spectrum, the latter having the shape of a missing wedge (see Figure). The proposed method tackles both issues jointly, by iteratively applying a denoising algorithm in order to fill up the MW, and proceeds as follows:
1. Excitation step: add noise into the MW.
2. Denoising step: apply a patch-based denoising algorithm.
3. Repeat steps 1 and 2, keeping KS constant through the iterations.
The excitation step is used to randomly initialize the coefficients of the MW, whereas the denoising step acts as a spatial regularization. The employed denoising algorithm, which exploits the self-similarity of the image, filters out coefficient values that are dissimilar to KS, thereby keeping similar ones. By iterating these steps, we are able to diffuse the information contained in KS into the MW. An application example on experimental data can be seen in the Figure, which shows the data in both the spectral and spatial domains. The data contain a spherical gold particle deformed by MW-induced artifacts: elongation of the object, and side- and ray-artifacts. The residue image shows that noise and MW artifacts have been reduced, while the details of the image are preserved. Experiments are being performed to verify whether particle detection and alignment are enhanced by using the method as a pre-processing step.
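A 2D sketch of this iteration, assuming NumPy and scikit-image; `denoise_tv_chambolle` stands in for the patch-based, self-similarity denoiser actually used, and the noise level injected into the wedge is an illustrative parameter:

```python
import numpy as np
from skimage.restoration import denoise_tv_chambolle

def fill_missing_wedge(image, wedge_mask, n_iter=20, noise_std=0.1, tv_weight=0.1):
    """Iterative missing-wedge restoration (2D toy version).

    wedge_mask: boolean array in FFT layout, True inside the missing wedge.
    """
    F = np.fft.fft2(image)
    known = F.copy()                                   # KS values to re-impose
    rng = np.random.default_rng(0)
    # Excitation step: random initialization of the wedge coefficients.
    F[wedge_mask] = noise_std * (rng.standard_normal(wedge_mask.sum())
                                 + 1j * rng.standard_normal(wedge_mask.sum()))
    for _ in range(n_iter):
        x = np.real(np.fft.ifft2(F))
        # Denoising step: spatial regularization (TV used as a stand-in).
        x = denoise_tv_chambolle(x, weight=tv_weight)
        F = np.fft.fft2(x)
        F[~wedge_mask] = known[~wedge_mask]            # keep KS constant
    return np.real(np.fft.ifft2(F))
```

Re-imposing KS at every iteration is what diffuses the known spectral information into the wedge while the denoiser suppresses coefficients inconsistent with it.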
Journal of Mathematical Imaging and Vision, 2016
We propose a variational aggregation method for optical flow estimation. It consists of a two-step framework, first estimating a collection of parametric motion models to generate motion candidates, and then reconstructing a global dense motion field. The aggregation step is designed as a motion reconstruction problem from spatially varying sets of motion candidates given by parametric motion models. Our method is designed to capture large displacements in a variational framework without requiring any coarse-to-fine strategy. We handle occlusions with a motion inpainting approach in the candidate computation step. By performing parametric motion estimation, we combine the robustness to noise of local parametric methods with the accuracy yielded by global regularization. We demonstrate the performance of our aggregation approach by comparing it to standard variational methods and a discrete aggregation approach on the Middlebury and MPI Sintel datasets.
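A discrete toy version of the aggregation step: per-pixel candidate selection by iterated conditional modes (ICM), trading each candidate's data cost against its deviation from the local mean flow. The paper's aggregation is variational and continuous; this smoothness term and the ICM scheme are assumptions for illustration only:

```python
import numpy as np

def aggregate_flow(candidates, data_cost, beta=0.5, n_sweeps=5):
    """Select one motion candidate per pixel by ICM.

    candidates: (K, H, W, 2) candidate flows from parametric models.
    data_cost:  (K, H, W) data-fidelity cost of each candidate.
    """
    labels = data_cost.argmin(axis=0)                   # best data fit first
    for _ in range(n_sweeps):
        flow = np.take_along_axis(
            candidates, labels[None, :, :, None], axis=0)[0]
        # 4-neighbour mean flow via rolls (wrap-around at borders).
        nb = sum(np.roll(flow, s, axis=a)
                 for a in (0, 1) for s in (-1, 1)) / 4.0
        # Smoothness cost: squared deviation from the neighbourhood mean.
        smooth = ((candidates - nb[None]) ** 2).sum(axis=-1)
        labels = (data_cost + beta * smooth).argmin(axis=0)
    return np.take_along_axis(candidates, labels[None, :, :, None], axis=0)[0]
```

Because the candidates come from parametric models fitted over large supports, even this discrete selection can recover large displacements without a coarse-to-fine pyramid.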
IEEE Journal of Selected Topics in Signal Processing, 2016
Microscopy imaging, including fluorescence microscopy and electron microscopy, has taken a prominent role in life science research and medicine due to its ability to investigate the 3D interior of live cells and organisms. A long-term research goal in bio-imaging at the sub-cellular and cellular scales is then to infer the relationships between the dynamics of macromolecules and their functions. In this area, image processing and analysis methods are now essential to understand the dynamic organization of groups of interacting molecules inside molecular machineries and to address issues in fundamental biology driven by advances in molecular biology, optics and technology. In this paper, we present recent advances in fluorescence and electron microscopy, and we focus on dedicated image processing and analysis methods required to quantify phenotypes for a limited number of typical studies in cell imaging.
IEEE Journal of Selected Topics in Signal Processing, 2015
A quantitative analysis of the dynamic contents in fluorescence time-lapse microscopy is crucial to decipher the molecular mechanisms involved in cell functions. In this paper, we propose an original traffic analysis approach based on the counting of particles from frame to frame. The suggested method lies between individual object tracking and dense motion estimation (i.e., optical flow). Instead of tracking each moving particle, we estimate fluxes of particles between predefined and adjacent regions. The problem is formulated as the minimization of a global cost function and the approach allows us to process image sequences with a high number of particles and a high rate of particle appearances and disappearances. We propose to study the influence of object density, image partition scale, motion amplitude, and particle appearances/disappearances in a large variety of simulations. The potential of the method is finally demonstrated on real image sequences showing GFP-tagged Rab6 trafficking in confocal microscopy.
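A least-squares toy version of the flux-counting idea: given per-region counts at two instants and the region adjacency, non-negative fluxes can be recovered by NNLS from a conservation constraint. Particle appearances and disappearances, which the paper handles, are ignored here (a virtual source/sink region would model them); the whole formulation is a simplified stand-in for the paper's global cost function:

```python
import numpy as np
from scipy.optimize import nnls

def estimate_fluxes(counts_t, counts_t1, edges):
    """Estimate non-negative particle fluxes between adjacent regions.

    counts_t, counts_t1: per-region particle counts at times t and t+1.
    edges: list of directed (src, dst) pairs between adjacent regions.
    """
    n_regions, n_edges = len(counts_t), len(edges)
    A = np.zeros((n_regions, n_edges))
    for j, (src, dst) in enumerate(edges):
        A[src, j] -= 1.0   # particles leaving src along this edge
        A[dst, j] += 1.0   # particles entering dst along this edge
    dn = np.asarray(counts_t1, float) - np.asarray(counts_t, float)
    flux, _ = nnls(A, dn)  # conservation: A @ flux ~= count changes
    return dict(zip(edges, flux))
```

Counting per region rather than tracking each particle is what keeps the problem tractable at high particle densities.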
Journal of the Optical Society of America A, 2015
Fluorescence lifetime is usually defined as the average nanosecond-scale delay between the excitation and emission of fluorescence. It has been established that lifetime measurement yields numerous indications on cellular processes such as inter-protein and intra-protein mechanisms, through fluorescent tagging and Förster resonance energy transfer (FRET). In this area, frequency-domain fluorescence lifetime imaging microscopy (FD FLIM) is particularly well suited to probing a sample non-invasively and quantifying these interactions in living cells. The aim is then to measure the fluorescence lifetime at each location in space from the fluorescence variations observed in a temporal sequence of images obtained by phase modulation of the detection signal. As a consequence, lifetime determination is sensitive to other sources of fluorescence variation, such as intracellular motion. In this paper, we propose a robust statistical method for lifetime estimation on both background and small moving structures, with a focus on intracellular vesicle trafficking.
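The standard frequency-domain estimators that such a method builds on can be computed per pixel from a homodyne phase stack. The sketch below uses the textbook formulas tau_phi = tan(phi)/omega and tau_m = sqrt(1/m^2 - 1)/omega; it is not the robust estimator proposed in the paper, and the phase sign convention depends on the acquisition:

```python
import numpy as np

def fd_flim_lifetimes(stack, mod_freq_hz):
    """Pixelwise phase and modulation lifetimes from a homodyne FD FLIM
    stack of K images taken at evenly spaced detector phase steps.

    stack: (K, H, W) array; mod_freq_hz: modulation frequency in Hz.
    """
    omega = 2.0 * np.pi * mod_freq_hz
    F = np.fft.fft(stack, axis=0)          # FFT over the phase dimension
    dc = np.real(F[0])                     # K times the mean intensity
    first = F[1]                           # first harmonic per pixel
    phi = np.mod(-np.angle(first), 2 * np.pi)        # phase delay
    m = 2.0 * np.abs(first) / np.maximum(dc, 1e-12)  # modulation depth
    tau_phi = np.tan(phi) / omega
    tau_m = np.sqrt(np.maximum(1.0 / m ** 2 - 1.0, 0.0)) / omega
    return tau_phi, tau_m
```

Any intensity fluctuation between phase steps (e.g., a moving vesicle) corrupts `first` directly, which is exactly why a robust estimator is needed for moving structures.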
Patch-based methods have been widely used for noise reduction in recent years. In this paper, we propose a general statistical aggregation method which combines image patches denoised with several commonly used algorithms. We show that weakly denoised versions of the input image, obtained with standard methods, can serve to compute an efficient patch-based aggregated estimator. In our approach, we evaluate Stein's Unbiased Risk Estimator (SURE) of each denoised candidate image patch and use this information to compute the exponentially weighted aggregation (EWA) estimator. The aggregation method is flexible enough to combine any standard denoising algorithms and has an interpretation in terms of a Gibbs distribution. The resulting denoising algorithm (PEWA) is based on MCMC sampling and produces results that are comparable to the current state of the art.
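A toy version of the EWA idea, aggregating a few weak denoisers with exponential weights driven by a Monte Carlo SURE estimate (the black-box divergence trick of Ramani et al.). The paper computes patch-wise weights by MCMC sampling; this sketch uses one global weight per candidate, and the temperature and the set of denoisers are illustrative assumptions:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, median_filter

def mc_sure(y, denoiser, sigma, eps=1e-3):
    """Per-pixel Monte Carlo SURE for a black-box denoiser.
    eps should be small relative to the image intensity range."""
    n = y.size
    fy = denoiser(y)
    b = np.random.randn(*y.shape)
    div = (b * (denoiser(y + eps * b) - fy)).sum() / eps
    risk = ((fy - y) ** 2).sum() / n - sigma ** 2 + 2 * sigma ** 2 * div / n
    return risk, fy

def ewa_denoise(y, sigma, temperature=None):
    """Exponentially weighted aggregation of weakly denoised candidates."""
    if temperature is None:
        temperature = sigma ** 2          # controls weight sharpness
    denoisers = [lambda x: gaussian_filter(x, 1.0),
                 lambda x: gaussian_filter(x, 2.0),
                 lambda x: median_filter(x, 3).astype(float)]
    risks, estimates = zip(*(mc_sure(y, d, sigma) for d in denoisers))
    # Gibbs-style weights: lower estimated risk, higher weight.
    w = np.exp(-(np.array(risks) - min(risks)) / temperature)
    w /= w.sum()
    return sum(wi * ei for wi, ei in zip(w, estimates))
```

The Gibbs form of the weights is what gives the aggregation its interpretation as sampling from a posterior over candidate estimators, which PEWA exploits with MCMC at the patch level.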
Mitochondria are dynamic organelles playing essential metabolic and signaling functions in cells. Their ultrastructure has largely been investigated with electron microscopy (EM) techniques, which have provided a wide range of information on how mitochondria acquire a tissue-specific shape, how they change during development, and how they are altered in disease conditions. However, quantifying protein-protein proximities using EM is extremely challenging. Super-resolution microscopy techniques such as direct stochastic optical reconstruction microscopy (dSTORM) now provide a fluorescence-based alternative to EM with a higher quantitative throughput. Recently, super-resolution microscopy approaches including dSTORM have led to valuable advances in our knowledge of mitochondrial ultrastructure, and in linking it with new insights into organelle functions. Nevertheless, dSTORM is currently used to image integral mitochondrial proteins only, and there is little or no information on proteins transiently p...