Frequency : 12 issues per year
Subject : Computer Applications and Technology
ISSN : 2319–8656 (Online)
IJCATR Volume 8 Issue 9
Campus Placement Analyzer: Using Supervised Machine Learning Algorithms
Shubham Khandale, Sachin Bhoite, Dr. Ajit More
10.7753/IJCATR0809.1004
keywords : Pre-processing, Feature Selection, Domain Expertise, Outliers, Bagging, Boosting, SVM, KNN, Logistic Regression
The main aim of every academic aspirant is placement in a reputed MNC, and an institute's reputation and yearly admissions depend in part on the placements it secures for its students. A system that predicts students' placements can therefore have a positive impact on an institute, increase its intake, and reduce the workload of its Training and Placement Office (TPO). With the help of Machine Learning techniques, knowledge can be extracted from the records of previously placed students and the placement of upcoming students can be predicted. The training data is taken from the same institute for which the placement prediction is done. Suitable data pre-processing methods are applied along with feature selection. Domain expertise is used during pre-processing and for handling outliers present in the dataset. We have used various Machine Learning algorithms such as Logistic Regression, SVM, KNN, Decision Tree, and Random Forest, along with advanced techniques such as Bagging, Boosting, and Voting Classifiers, and achieved 78% accuracy with the XGBoost classifier and 78% with the AdaBoost classifier.
@article{s892019ijcatr08091004,
Title = "Campus Placement Analyzer: Using Supervised Machine Learning Algorithms",
Journal ="International Journal of Computer Applications Technology and Research(IJCATR)",
Volume = "8",
Issue ="9",
Pages ="358 - 362",
Year = "2019",
Authors ="Shubham Khandale, Sachin Bhoite, Dr. Ajit More"}
A Decision Tree is used as the base classifier; an AdaBoost classifier is built over it, and a Bagging classifier over that
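The stacking of ensembles described above can be sketched with scikit-learn. This is a minimal illustration on synthetic data, since the institute's dataset is not public; the hyperparameters (tree depth, number of estimators) are assumptions, not the paper's settings.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the placement dataset (31 features, as in the paper).
X, y = make_classification(n_samples=400, n_features=31, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# AdaBoost over a shallow decision-tree base learner...
ada = AdaBoostClassifier(DecisionTreeClassifier(max_depth=1),
                         n_estimators=50, random_state=0)

# ...and a bagging classifier over the boosted model.
bagged = BaggingClassifier(ada, n_estimators=10, random_state=0)
bagged.fit(X_train, y_train)
accuracy = bagged.score(X_test, y_test)
```

Bagging over an already-boosted model reduces variance on top of boosting's bias reduction, at the cost of training the AdaBoost ensemble once per bag.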
Voting classifiers (hard/soft) are used for prediction
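Hard voting takes the majority class label across the base models, while soft voting averages their predicted probabilities. A minimal sketch, assuming a combination of three of the paper's base models (the actual ensemble composition is not specified here):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=400, n_features=31, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

estimators = [
    ("lr", LogisticRegression(max_iter=1000)),
    ("rf", RandomForestClassifier(random_state=0)),
    ("knn", KNeighborsClassifier()),
]

# Hard voting: majority of predicted labels.
hard = VotingClassifier(estimators=estimators, voting="hard").fit(X_train, y_train)
# Soft voting: argmax of averaged class probabilities.
soft = VotingClassifier(estimators=estimators, voting="soft").fit(X_train, y_train)

hard_acc = hard.score(X_test, y_test)
soft_acc = soft.score(X_test, y_test)
```

Soft voting requires every base model to implement `predict_proba`; it often edges out hard voting when the models' probability estimates are well calibrated.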
Boosting classifiers such as Gradient Boosting and XGBoost are used
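A gradient-boosting sketch on synthetic data, using scikit-learn's `GradientBoostingClassifier`; `xgboost.XGBClassifier` exposes the same `fit`/`score` interface and could be dropped in. The hyperparameters below are illustrative assumptions, not the settings behind the paper's 78% result.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=400, n_features=31, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Gradient boosting: each new tree fits the residual errors of the
# ensemble built so far, shrunk by the learning rate.
gb = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1,
                                max_depth=3, random_state=0)
gb.fit(X_train, y_train)
gb_acc = gb.score(X_test, y_test)
```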
Lasso, Ridge, and Random Forest feature importance are used for feature selection
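The three selection signals can be computed as follows: Lasso zeroes out weak coefficients entirely, Ridge shrinks coefficients so features can be ranked by magnitude, and a Random Forest exposes impurity-based importances. A sketch on synthetic data; the `alpha` values are assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import Lasso, Ridge

X, y = make_classification(n_samples=400, n_features=31,
                           n_informative=8, random_state=0)

# Lasso (L1) drives coefficients of uninformative features to exactly zero.
lasso = Lasso(alpha=0.05).fit(X, y)
kept_by_lasso = np.flatnonzero(lasso.coef_)

# Ridge (L2) only shrinks, so rank features by absolute coefficient.
ridge = Ridge(alpha=1.0).fit(X, y)
ridge_rank = np.argsort(-np.abs(ridge.coef_))

# Random-forest impurity-based importances (sum to 1 across features).
rf = RandomForestClassifier(random_state=0).fit(X, y)
top_rf = np.argsort(-rf.feature_importances_)[:10]
```

Features that score well under all three signals are the safest to keep; a feature kept only by one method is a candidate for review with domain expertise, as the paper does during pre-processing.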
31 features, covering academic records from 10th standard through the third year of engineering, are used to train the model
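The overall training setup, with the baseline models named in the abstract fitted on 31 features, can be sketched as below. The synthetic data stands in for the institute's private records, and scaling the distance- and margin-based models is our assumption, not a step the paper documents.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in: 31 features mirroring 10th-standard-to-third-year marks.
X, y = make_classification(n_samples=400, n_features=31, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Scale inputs for the models sensitive to feature magnitude (LR, SVM, KNN).
models = {
    "logistic": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "svm": make_pipeline(StandardScaler(), SVC()),
    "knn": make_pipeline(StandardScaler(), KNeighborsClassifier()),
    "tree": DecisionTreeClassifier(random_state=0),
    "forest": RandomForestClassifier(random_state=0),
}
scores = {name: m.fit(X_train, y_train).score(X_test, y_test)
          for name, m in models.items()}
```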