Abstract:
Support Vector Machine (SVM) is a popular and efficient classification algorithm in the machine learning (ML) paradigm. However, the kernel dependency of the SVM algorithm requires a long time to compute the support vectors for non-linear datasets. To remove the kernel, several types of functions have been used with SVM. In earlier attempts, adding a kernel-free approach to SVM caused major problems such as repetitive feature spaces and high run-time complexity. Therefore, a non-linear function, namely the ith-degree polynomial (DiP), is introduced in this study. This function can directly identify non-linear features in a dataset. In this paper, a kernel-free SVM model using the DiP function, named DiPSVM, is proposed. In DiPSVM, the input feature space is first mapped into a new higher-order feature space using the multivariable Taylor expansion of the input features up to the ith order, so that all the non-linear correlations among the input features are evaluated. This method is implemented to check which specific polynomial order can increase the accuracy of the kernel-free SVM model. Finally, sequential minimal optimization (SMO) is used for fast convergence to reduce the run time. The superiority of DiPSVM is demonstrated on ten benchmark categorical and continuous datasets obtained from the UCI machine learning repository. Results reveal that DiPSVM gives better accuracy and faster convergence than kernel-based SVM algorithms.
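The feature-space mapping described above can be illustrated with a minimal sketch. The function below is a hypothetical illustration (not the authors' implementation) of the DiP idea: it enumerates all monomials of the input features up to a chosen order, i.e. the terms of a multivariable Taylor expansion, so that a linear, kernel-free classifier trained on the expanded vector can capture non-linear correlations among the original features.

```python
from itertools import combinations_with_replacement

def dip_features(x, degree):
    """Map an input vector into an ith-order polynomial feature space.

    Hypothetical sketch of the DiP mapping: returns every monomial of
    the input features up to `degree`, mirroring a multivariable
    Taylor expansion truncated at the ith order.
    """
    feats = [1.0]  # constant (zeroth-order) term
    for d in range(1, degree + 1):
        # combinations_with_replacement enumerates each degree-d monomial once
        for idx in combinations_with_replacement(range(len(x)), d):
            prod = 1.0
            for i in idx:
                prod *= x[i]
            feats.append(prod)
    return feats

# A 2-feature point expanded to 2nd order:
# monomials [1, x1, x2, x1^2, x1*x2, x2^2]
print(dip_features([2.0, 3.0], 2))  # → [1.0, 2.0, 3.0, 4.0, 6.0, 9.0]
```

In this sketch the expanded vectors would then be fed to a linear SVM (e.g. solved with SMO), removing the need for a kernel at the cost of a larger, explicit feature space.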