Effect of Data Scaling Methods on Machine Learning Algorithms and Model Performance

Ahsan, Md and Mahmud, M. and Saha, Pritom and Gupta, Kishor and Siddique, Zahed (2021) Effect of Data Scaling Methods on Machine Learning Algorithms and Model Performance. Technologies, 9 (3). p. 52. ISSN 2227-7080

Text: technologies-09-00052-v3.pdf - Published Version (488 kB)

Abstract

Heart disease, one of the main reasons behind the high mortality rate around the world, requires a sophisticated and expensive diagnosis process. In the recent past, much of the literature has demonstrated machine learning approaches as an opportunity to diagnose heart disease patients efficiently. However, dataset challenges such as missing data, inconsistent data, and mixed data (containing both numerical and categorical values with inconsistent missing entries) are frequent obstacles in medical diagnosis. Such inconsistencies increase the probability of misprediction and misleading results. Data preprocessing steps such as feature reduction, data conversion, and data scaling are employed to form a standard dataset; these measures play a crucial role in reducing inaccuracy in the final prediction. This paper aims to evaluate eleven machine learning (ML) algorithms, namely Logistic Regression (LR), Linear Discriminant Analysis (LDA), K-Nearest Neighbors (KNN), Classification and Regression Trees (CART), Naive Bayes (NB), Support Vector Machine (SVM), XGBoost (XGB), Random Forest Classifier (RF), Gradient Boost (GB), AdaBoost (AB), and Extra Tree Classifier (ET), together with six data scaling methods, namely Normalization (NR), Standscale (SS), MinMax (MM), MaxAbs (MA), Robust Scaler (RS), and Quantile Transformer (QT), on a dataset comprising information on patients with heart disease. The results show that CART, combined with RS or QT, outperforms all other ML algorithms with 100% accuracy, 100% precision, 99% recall, and a 100% F1 score. The study outcomes demonstrate that model performance varies depending on the data scaling method.
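
For readers who want to try the general idea, the following minimal Python sketch (not the authors' exact pipeline) pairs CART with each of the six scaling methods using scikit-learn. The built-in breast-cancer dataset is only a stand-in for the heart-disease data, and the estimator settings are illustrative assumptions.

# Minimal sketch: compare scaling methods with a CART classifier.
# Placeholder dataset and hyperparameters; not the paper's exact setup.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import (
    Normalizer, StandardScaler, MinMaxScaler,
    MaxAbsScaler, RobustScaler, QuantileTransformer,
)
from sklearn.tree import DecisionTreeClassifier  # scikit-learn's CART implementation

# Placeholder binary-classification data (stands in for the heart-disease dataset).
X, y = load_breast_cancer(return_X_y=True)

# The six scaling methods named in the abstract.
scalers = {
    "NR": Normalizer(),
    "SS": StandardScaler(),
    "MM": MinMaxScaler(),
    "MA": MaxAbsScaler(),
    "RS": RobustScaler(),
    "QT": QuantileTransformer(n_quantiles=100, output_distribution="normal", random_state=0),
}

# Fit each scaler + CART pipeline and report cross-validated accuracy.
for name, scaler in scalers.items():
    pipe = make_pipeline(scaler, DecisionTreeClassifier(random_state=0))
    scores = cross_val_score(pipe, X, y, cv=5, scoring="accuracy")
    print(f"CART + {name}: mean accuracy = {scores.mean():.3f}")

Swapping the DecisionTreeClassifier for any of the other ten algorithms (or the scoring argument for precision, recall, or f1) reproduces the kind of algorithm-versus-scaler comparison the abstract describes.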

Item Type: Article
Subjects: Library Keep > Multidisciplinary
Depositing User: Unnamed user with email support@librarykeep.com
Date Deposited: 31 Mar 2023 07:58
Last Modified: 23 Apr 2024 12:09
URI: http://archive.jibiology.com/id/eprint/421
