Title: A General View for Network Embedding as Matrix Factorization
Authors: Xin Liu, Tsuyoshi Murata, Kyoung-Sook Kim, Chatchawan Kotarasu, Chenyi Zhuang
Affiliations: Tokyo Institute of Technology; Mahidol University; National Institute of Advanced Industrial Science and Technology
Published in: WSDM 2019 - Proceedings of the 12th ACM International Conference on Web Search and Data Mining (2019), pp. 375-383
Type: Conference Paper
DOI: 10.1145/3289600.3291029
Scopus ID: 2-s2.0-85061741480
Repository: https://repository.li.mahidol.ac.th/handle/123456789/50652
Dates: issued 2019-01-30; deposited 2020-01-27

Abstract
© 2019 Association for Computing Machinery. We propose a general view that demonstrates the relationship between network embedding approaches and matrix factorization. Unlike previous work, which establishes this equivalence from a skip-gram model perspective, we provide a more fundamental connection from an optimization (objective function) perspective. We demonstrate that matrix factorization is equivalent to optimizing two objectives: one brings together the embeddings of similar nodes; the other separates the embeddings of distant nodes. The matrix to be factorized has the general form S − β·1. The elements of S indicate pairwise node similarities; they can be based on any user-defined similarity/distance measure or learned from random walks on networks. The shift number β is related to a parameter that balances the two objectives. More importantly, the resulting embeddings are sensitive to β, and we can improve the embeddings by tuning β. Experiments show that matrix factorization based on a newly proposed similarity measure and a β-tuning strategy significantly outperforms existing matrix factorization approaches on a range of benchmark networks.
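The shifted factorization described in the abstract can be sketched in a few lines of NumPy: build a pairwise similarity matrix S, subtract a scalar shift β from every entry, and take a truncated SVD of the result to obtain node embeddings. This is a minimal illustration under assumed choices — the two-step-reachability similarity and the β grid below are hypothetical stand-ins, not the paper's actual similarity measure or β-tuning strategy.

```python
import numpy as np

def embed(S, beta, dim):
    """Factorize the shifted similarity matrix S - beta (elementwise)
    with a truncated SVD to obtain dim-dimensional node embeddings."""
    M = S - beta  # shift every pairwise similarity by the scalar beta
    U, sigma, _ = np.linalg.svd(M, full_matrices=False)
    # Keep the top-`dim` singular directions, scaled by sqrt(singular value),
    # so the embedding inner products approximate the shifted matrix.
    return U[:, :dim] * np.sqrt(sigma[:dim])

# Toy example: a 4-node path graph.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

# Hypothetical similarity: direct edges plus down-weighted 2-step paths.
S = A + 0.5 * (A @ A)

# The paper reports that embedding quality is sensitive to beta,
# so in practice one would sweep beta and keep the best embeddings.
for beta in (0.0, 0.25, 0.5):
    X = embed(S, beta, dim=2)
    print(f"beta={beta}: embedding shape {X.shape}")
```

In a real pipeline the candidate β values would be scored on a downstream task (e.g. link prediction or node classification) rather than merely printed.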