Extreme learning machine (ELM) is a randomized algorithm that randomly generates the input weights and hidden-node biases of a single-hidden-layer feed-forward neural network (SLFNN) and then determines the output weights analytically. Given a fixed SLFNN architecture, repeatedly training SLFNNs with ELM yields different learning models. This paper proposes an approach that integrates these models for data classification. Specifically, several SLFNNs are first trained by ELM; the trained SLFNNs are then integrated by majority voting; finally, the integrated model is used for data classification. We experimentally compared the proposed approach with ELM and ensemble ELM (EELM) on 10 data sets. The experimental results show that the proposed approach outperforms both ELM and EELM.
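The procedure described above — training several ELMs on the same data and combining their predictions by majority voting — can be sketched as follows. This is a minimal illustration, not the authors' implementation; the `tanh` activation, the function names, and the toy two-blob data set are assumptions introduced here for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_elm(X, y, n_hidden, n_classes, rng):
    # ELM step: input weights W and hidden biases b are drawn at random.
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)            # hidden-layer output matrix
    T = np.eye(n_classes)[y]          # one-hot target matrix
    beta = np.linalg.pinv(H) @ T      # output weights solved analytically
    return W, b, beta

def predict_elm(model, X):
    W, b, beta = model
    return np.argmax(np.tanh(X @ W + b) @ beta, axis=1)

def majority_vote(models, X):
    # Stack each model's predicted labels: shape (n_models, n_samples).
    votes = np.stack([predict_elm(m, X) for m in models])
    # Majority voting: the most frequent class label per sample wins.
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)

# Toy two-class data: two well-separated Gaussian blobs (illustrative only).
X = np.vstack([rng.standard_normal((50, 2)) - 2,
               rng.standard_normal((50, 2)) + 2])
y = np.array([0] * 50 + [1] * 50)

# Train several SLFNNs by ELM, then integrate them by majority voting.
models = [train_elm(X, y, n_hidden=20, n_classes=2, rng=rng) for _ in range(7)]
pred = majority_vote(models, X)
acc = (pred == y).mean()
```

Because each ELM draws its own random input weights and biases, the individual models disagree on borderline samples; majority voting averages out these individual errors, which is the intuition behind the ensemble's improvement over a single ELM.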