Extreme Learning Machine (ELM) has been proposed as a new algorithm for training single-hidden-layer feedforward neural networks. The main merit of ELM is that the input weights and hidden-layer biases are randomly generated, so the output weights can be obtained analytically; this overcomes drawbacks of gradient-based training algorithms such as local optima, improper learning rates, and slow convergence. Based on the consistency property of data, which enforces that similar samples share similar properties, we propose a discriminative graph regularized Extreme Learning Machine (GELM) to further enhance classification performance. In the proposed GELM model, the label information of the training samples is used to construct an adjacency graph, and the corresponding graph regularization term constrains the output weights to produce similar outputs for samples from the same class. Like standard ELM, the proposed GELM model has a closed-form solution, so the output weights can still be obtained efficiently. Experiments on several widely used face databases show that GELM achieves substantial performance gains over standard ELM and regularized ELM. Moreover, GELM also performs well compared with state-of-the-art classification methods for face recognition.
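The training procedure described above can be sketched as follows. This is a minimal illustration, not the paper's reference implementation: it assumes a sigmoid hidden activation, an unnormalized graph Laplacian built from the label-based adjacency, and a closed-form output-weight solution that augments the standard ELM least-squares problem with the graph penalty and a small ridge term; the regularization coefficients `lam_graph` and `lam_ridge` are hypothetical names.

```python
import numpy as np

def gelm_train(X, y, n_hidden=200, lam_graph=0.1, lam_ridge=1e-3, seed=0):
    """Sketch of graph-regularized ELM training (assumed formulation).

    Input weights and biases are random, as in standard ELM; the output
    weights are obtained in closed form, with a label-based graph
    Laplacian penalty pulling same-class outputs together.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    classes = np.unique(y)
    T = (y[:, None] == classes[None, :]).astype(float)  # one-hot targets
    # Random hidden layer (sigmoid activation), never trained.
    W = rng.standard_normal((d, n_hidden))
    b = rng.standard_normal(n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    # Adjacency graph from labels: same-class samples are connected.
    A = (y[:, None] == y[None, :]).astype(float)
    L = np.diag(A.sum(axis=1)) - A  # unnormalized graph Laplacian
    # Closed-form output weights with graph + ridge regularization
    # (assumed form: (H^T H + l1 H^T L H + l2 I) beta = H^T T).
    beta = np.linalg.solve(
        H.T @ H + lam_graph * H.T @ L @ H + lam_ridge * np.eye(n_hidden),
        H.T @ T,
    )
    return W, b, beta, classes

def gelm_predict(X, W, b, beta, classes):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return classes[np.argmax(H @ beta, axis=1)]
```

Note that the only linear system solved involves an `n_hidden` by `n_hidden` matrix, so the graph term adds little cost beyond standard regularized ELM.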