Improving ADA-boost as a Popular Ensemble in Classification Problems
M. Sravan Kumar Reddy1, K.E. Naresh Kumar2, Dharmendra Singh Rajput3

1M. Sravan Kumar Reddy, Assistant Professor, Department of CSE, RGMCET, Nandyal, Kurnool, India.

2K.E. Naresh Kumar, Assistant Professor, Department of CSE, RGMCET, Nandyal, Kurnool, India.

3Dr. Dharmendra Singh Rajput, Associate Professor, SITE, VIT University, Vellore, India.

Manuscript received on 02 July 2019 | Revised Manuscript received on 16 July 2019 | Manuscript Published on 23 August 2019 | PP: 241-243 | Volume-8 Issue-9S3 August 2019 | Retrieval Number: I30430789S319/2019©BEIESP | DOI: 10.35940/ijitee.I3043.0789S319

© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open-access article under the CC-BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)

Abstract: In data mining, many classification algorithms build a model from a single learner, but a single learner often cannot achieve high classification accuracy. To improve accuracy, multiple learners can be trained and combined into an ensemble. An ensemble increases generalization ability and robustness [3]; because of this advantage, ensemble classification is a major research direction in machine learning. An ensemble is also much stronger than any single base learner at producing an accurate hypothesis. Ensembles can be divided into homogeneous or heterogeneous, and into dependent or independent, ensembles. Dependent ensemble methods such as boosting, and AdaBoost in particular, reliably produce accurate hypotheses. AdaBoost can therefore serve as a strong classifier ensemble for generating accurate results.
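The paper itself gives no code; as a minimal sketch of the dependent-ensemble idea the abstract describes — discrete AdaBoost, which reweights training samples each round and combines weak learners by weighted vote — the following assumes decision stumps as base learners. All function names and the toy data below are this sketch's own, not the paper's:

```python
import numpy as np

def train_stump(X, y, w):
    """Exhaustively pick the (feature, threshold, polarity) stump
    with the lowest weighted error. y is in {-1, +1}; w sums to 1."""
    best = (0, 0.0, 1, 1.0 + 1e-9)  # feature, threshold, polarity, error
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
                err = w[pred != y].sum()
                if err < best[3]:
                    best = (j, thr, pol, err)
    return best

def adaboost_fit(X, y, rounds=100):
    """Discrete AdaBoost: each round fits a stump to the current
    sample weights, then upweights the examples it got wrong."""
    w = np.full(len(y), 1.0 / len(y))
    stumps, alphas = [], []
    for _ in range(rounds):
        j, thr, pol, err = train_stump(X, y, w)
        err = max(err, 1e-10)                    # guard divide-by-zero
        alpha = 0.5 * np.log((1.0 - err) / err)  # this learner's vote weight
        pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
        w = w * np.exp(-alpha * y * pred)        # boost the mistakes
        w /= w.sum()
        stumps.append((j, thr, pol))
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(stumps, alphas, X):
    """Weighted majority vote over all trained stumps."""
    score = np.zeros(len(X))
    for (j, thr, pol), alpha in zip(stumps, alphas):
        score += alpha * np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
    return np.sign(score)

# Toy 1-D data: the middle band is labelled -1, so no single stump
# (one threshold) can classify all six points, but the ensemble can.
X_demo = np.array([[0.0], [1.0], [2.0], [3.0], [4.0], [5.0]])
y_demo = np.array([1, 1, -1, -1, 1, 1])
```

On the toy data the best individual stump misclassifies two of the six points, while the boosted combination labels all of them correctly — illustrating the abstract's claim that the combined hypothesis is stronger than any single base learner.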

Keywords: AdaBoost, Ensemble Classification, Classifier Ensemble, Accuracy, Bagging
Scope of the Article: Classification