      Variational Information Bottleneck for Unsupervised Clustering: Deep Gaussian Mixture Embedding

      research-article


          Abstract

In this paper, we develop an unsupervised generative clustering framework that combines the variational information bottleneck and the Gaussian mixture model. Specifically, we use the variational information bottleneck method and model the latent space as a mixture of Gaussians. We derive a bound on the cost function of our model that generalizes the Evidence Lower Bound (ELBO) and provide a variational inference-type algorithm to compute it. In the algorithm, the coders’ mappings are parametrized using neural networks, and the bound is approximated by Markov sampling and optimized with stochastic gradient descent. Numerical results on real datasets are provided to demonstrate the efficiency of our method.
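The bound described in the abstract can be sketched as follows. This is a minimal, hypothetical NumPy illustration, not the paper's implementation: the function names are invented, the encoder posterior and decoder log-likelihood are passed in as plain arrays/callables rather than neural networks, and the ELBO-style objective is estimated by Monte Carlo sampling of the latent variable under a Gaussian-mixture prior, as the abstract outlines.

```python
import numpy as np

def log_gauss(u, mu, var):
    """Log density of a diagonal Gaussian, summed over the last axis."""
    return -0.5 * np.sum(np.log(2.0 * np.pi * var) + (u - mu) ** 2 / var, axis=-1)

def vib_gmm_bound(x, enc_mu, enc_var, log_lik, pi, mix_mu, mix_var,
                  n_samples=8, seed=0):
    """Monte Carlo estimate of an ELBO-style bound with a Gaussian-mixture prior.

    enc_mu, enc_var     : parameters of the encoder posterior q(u|x), a diagonal Gaussian
    log_lik(u, x)       : decoder log-likelihood log p(x|u)
    pi, mix_mu, mix_var : mixture prior p(u) = sum_c pi_c N(u; mu_c, var_c)
    """
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_samples):
        # Reparameterization trick: u = mu + sqrt(var) * eps, eps ~ N(0, I)
        u = enc_mu + np.sqrt(enc_var) * rng.standard_normal(enc_mu.shape)
        log_q = log_gauss(u, enc_mu, enc_var)          # log q(u|x)
        comps = np.array([np.log(pi[c]) + log_gauss(u, mix_mu[c], mix_var[c])
                          for c in range(len(pi))])
        m = comps.max()
        log_p = m + np.log(np.exp(comps - m).sum())    # log p(u) via log-sum-exp
        # Single-sample estimate of log p(x|u) - KL(q(u|x) || p(u))
        total += log_lik(u, x) - (log_q - log_p)
    return total / n_samples
```

As a sanity check on the structure: when the prior has a single component equal to the encoder posterior, the KL-style term vanishes for every sample, and the bound reduces to the average decoder log-likelihood. In the actual method, all of these quantities would be produced by trainable networks and the estimate maximized with stochastic gradient descent.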


Author and article information

Journal: Entropy (Basel), MDPI; ISSN 1099-4300
Published: 13 February 2020 (February 2020 issue); 22(2): 213

Affiliations
[1] Laboratoire d’informatique Gaspard-Monge, Université Paris-Est, 77454 Champs-sur-Marne, France
[2] Mathematical and Algorithmic Sciences Lab, Paris Research Center, Huawei Technologies, 92100 Boulogne-Billancourt, France

Author information
https://orcid.org/0000-0003-1835-6964
https://orcid.org/0000-0003-2023-9476

Article ID: entropy-22-00213
DOI: 10.3390/e22020213
PMCID: PMC7516645
PMID: 33285988
                © 2020 by the authors.

                Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license ( http://creativecommons.org/licenses/by/4.0/).

History
Received: 03 December 2019
Accepted: 09 February 2020
                Categories
                Article

Keywords: clustering, unsupervised learning, Gaussian mixture model, information bottleneck
