Drafts by Romain Cosentino
With newly available deep learning techniques, a new class of problems can be tackled. This is for example the case for classification or regression problems with an input space of infinite dimension. The mapping from input space to feature space can now be performed with supervised techniques that learn cascades of transformations, yielding a sparse and meaningful representation of the data in this new feature space. The limitation is the need for a supervised problem. This drawback has recently been pushed back by using a non-human teacher to complete an unsupervised task, without involving human expertise for data labeling, thanks to Reinforcement Learning. The first application was by Google DeepMind, which learned how to play Atari 2600 games. We describe the theoretical background as well as state-of-the-art approaches to the problem, and then develop our architecture, bringing new insights and improvements.
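To make the "non-human teacher" idea concrete, here is a minimal tabular Q-learning update sketch (an illustration only, not the draft's architecture; DeepMind's Atari agent replaced the table with a deep Q-network). The reward signal plays the role of the teacher: no labeled data is needed. The state/action sizes below are hypothetical.

```python
import numpy as np

def q_learning_step(Q, s, a, r, s_next, alpha=0.1, gamma=0.99):
    """One Bellman update: move Q(s, a) toward the observed reward
    plus the discounted best value achievable from the next state."""
    Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
    return Q

# Hypothetical usage: 5 states, 2 actions, one observed transition.
Q = np.zeros((5, 2))
Q = q_learning_step(Q, s=0, a=1, r=1.0, s_next=2)
```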
Other by Romain Cosentino
Papers by Romain Cosentino
A popular generative model for unsupervised clustering is the so-called Gaussian Mixture Model (GMM). It assumes that each data point is generated by a particular parametric model. To fit the model parameters to a given data set, one commonly uses the EM algorithm, which iteratively estimates the parameters. The main drawback of the GMM concerns the covariance matrices, each of which has d(d + 1)/2 values to be estimated, where d denotes the dimension of the data. Estimating the covariance matrices for high-dimensional data thus requires a large amount of data to prevent over-fitting. In this project, we study two different methods that address this issue. The first applies a direct regularization to the covariance matrix in order to make it sparse. The second uses another kind of generative model, the mixture of probabilistic principal component analyzers. Indeed, the probabilistic formulation of Principal Component Analysis enables one to apply dimensionality reduction directly while still being able to derive the model likelihood. The probabilistic interpretation therefore yields a generative model that reduces the dimension of the data while providing access to probabilistic tools such as Bayesian inference and likelihood estimation.
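To see where the d(d + 1)/2 cost enters, here is a minimal EM sketch for a full-covariance GMM (an illustrative implementation, not the project's code). Each of the k covariance matrices sigma[j] re-estimated in the M-step carries d(d + 1)/2 free values, which is exactly what the two methods above try to reduce.

```python
import numpy as np
from scipy.stats import multivariate_normal

def em_gmm(X, k, n_iter=100, seed=0):
    """Fit a k-component full-covariance GMM to X (n, d) via EM."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Initialise: uniform weights, random data points as means, identity covariances.
    pi = np.full(k, 1.0 / k)
    mu = X[rng.choice(n, k, replace=False)]
    sigma = np.stack([np.eye(d)] * k)
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point.
        resp = np.column_stack([
            pi[j] * multivariate_normal.pdf(X, mu[j], sigma[j]) for j in range(k)
        ])
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and covariances from responsibilities.
        nk = resp.sum(axis=0)
        pi = nk / n
        mu = (resp.T @ X) / nk[:, None]
        for j in range(k):
            diff = X - mu[j]
            # Full covariance: d(d + 1)/2 free values per component.
            sigma[j] = (resp[:, j, None] * diff).T @ diff / nk[j]
            sigma[j] += 1e-6 * np.eye(d)  # small ridge for numerical stability
    return pi, mu, sigma
```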
Talks by Romain Cosentino
Probabilistic Graphical Models. A popular generative model for clustering is the so-called Gaussian Mixture Model, which has too many free parameters for high-dimensional data sets. The two main solutions proposed in this project are the sGMM, which adds a sparsity regularization term to the likelihood, and the MPPCA, which uses a probabilistic view of PCA.
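A quick back-of-the-envelope comparison shows why MPPCA helps: it replaces the full covariance with W W^T + s^2 I, where W is a d-by-q factor-loading matrix. The count below is a rough sketch (it ignores the small rotational redundancy in W); q = 5 is an assumed latent dimension.

```python
def gmm_params(d):
    # Per component: mean (d) + full symmetric covariance (d(d + 1)/2).
    return d + d * (d + 1) // 2

def mppca_params(d, q):
    # Per component: mean (d) + loadings W (d * q) + isotropic noise variance (1).
    return d + d * q + 1

for d in (10, 100, 1000):
    print(d, gmm_params(d), mppca_params(d, q=5))
# At d = 1000 the full GMM needs ~500k values per component vs ~5k for MPPCA.
```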