Abstract: In recent years, deep clustering approaches based on variational autoencoders (VAE) have garnered significant attention due to their remarkable generative capabilities and clustering performance. However, existing VAE-based clustering algorithms typically revolve around optimizing the challenging Evidence Lower Bound (ELBO) to achieve clustering. Moreover, they often require prior knowledge of the class distribution, with most relying on Gaussian distributions or Gaussian mixture models. In this paper, a deep clustering network based on Hyperspherical Variational Auto-Encoders (HVAE), named DHVC (Deep Hyperspherical Variational Clustering network), is proposed. This approach substitutes the complex ELBO with the Kullback-Leibler divergence between two posterior distributions and employs the von Mises-Fisher mixture model (vMFMM) distribution as the embedding distribution in the latent space. Additionally, it maximizes the Extended Mutual Information (EMI) between latent representations and predicted cluster assignments to obtain more discriminative and balanced assignments. The effectiveness of the proposed deep clustering method is validated through comparisons with state-of-the-art deep clustering techniques on benchmark datasets. The proposed model provides a novel approach to feature learning and cluster analysis for complex data, extending the application of Bayesian inference methods to deep clustering.
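For reference, the von Mises-Fisher (vMF) distribution underlying the vMFMM latent embedding is defined on the unit hypersphere. The following is a brief sketch using the standard textbook definitions; the paper's own notation and parameterization may differ:

$$
f(\mathbf{z};\boldsymbol{\mu},\kappa) = C_d(\kappa)\,\exp\!\left(\kappa\,\boldsymbol{\mu}^{\top}\mathbf{z}\right),
\qquad \|\mathbf{z}\| = \|\boldsymbol{\mu}\| = 1,\ \kappa \ge 0,
$$

$$
C_d(\kappa) = \frac{\kappa^{d/2-1}}{(2\pi)^{d/2}\, I_{d/2-1}(\kappa)},
\qquad
p(\mathbf{z}) = \sum_{k=1}^{K} \pi_k\, f(\mathbf{z};\boldsymbol{\mu}_k,\kappa_k),
$$

where $I_{d/2-1}$ is the modified Bessel function of the first kind, $\boldsymbol{\mu}_k$ and $\kappa_k$ are the mean direction and concentration of component $k$, and $\pi_k$ are the mixture weights over the $K$ clusters.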