DML, Sharif University of Technology
Efficient Iterative Semi-Supervised Classification on Manifold
Date
Dec 2011
Keywords
Semi-supervised learning, Convergence analysis
M. Farajtabar, H.R. Rabiee, A. Shaban, and A. Soltani-Farani
Semi-Supervised Learning (SSL) has become a topic of recent research that effectively addresses the prob- lem of limited labeled data. Many SSL methods have been developed based on the manifold assumption, among them, the Local and Global Consistency (LGC) is a popular method. The problem with most of these algorithms, and in particular with LGC, is the fact that their naive implementations do not scale well to the size of data. Time and memory limitations are the major problems faced in large-scale problems. In this paper, we provide theoretical bounds on gradient descent, and to overcome the aforementioned problems, a new approximate Newton’s method is proposed. Moreover, convergence analysis and theoretical bounds for time complexity of the proposed method is provided. We claim that the number of iterations in the proposed methods, logarithmically depends on the number of data, which is a considerable improvement compared to the naive implementations. Experimental results on real world datasets confirm superiority of the proposed methods over LGC’s default iterative implementation and the state of the art factorization method.
Type
Workshop
Workshop
International Conference on Data Mining Workshops
Publisher
IEEE
Pages
228-235