Abstract:
The support vector machine (SVM) is one of the best-known machine learning algorithms and is highly adept at both linear and non-linear binary classification problems. A linear separating margin handles linearly separable data easily but fails on non-linearly separable data. SVMs equipped with kernel functions, which implicitly map the data into a higher-dimensional feature space where the classification is performed, have achieved great success in non-linear binary classification tasks. Unfortunately, explicitly defining and testing these kernels has proved difficult and cumbersome, making them computationally expensive. Recently, quadratic separation SVM models were introduced which use quadratic surfaces for non-linear binary separation. In this study, a kernel-free quadratic SVM for binary classification is proposed by integrating a novel Q-margin. The properties of the Q-margin and the existence of a unique solution are discussed. Furthermore, the performance of the proposed model is analyzed on nine public data sets. The numerical results demonstrate the effectiveness and significance of the proposed model: it performs 2-3% better on average than well-known SVM models, with or without kernels, and other state-of-the-art classifiers.
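To illustrate the general idea of kernel-free quadratic separation referred to above (not the proposed Q-margin formulation itself), the sketch below learns a quadratic decision surface by expanding each sample into its explicit degree-2 monomials and training a linear SVM on the lifted features; no kernel matrix is ever formed. The toy data set and all parameter choices are illustrative assumptions.

```python
# Minimal sketch of a kernel-free quadratic-surface classifier (illustrative only,
# not the paper's Q-margin model): a linear separator in the space of degree-2
# monomials corresponds to a quadratic surface f(x) = x^T W x + w^T x + b
# in the original space, learned without evaluating any kernel function.
from sklearn.datasets import make_circles
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler
from sklearn.svm import LinearSVC

# Non-linearly separable toy data (two concentric circles).
X, y = make_circles(n_samples=400, noise=0.05, factor=0.4, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Explicit degree-2 feature map [x1, x2, x1^2, x1*x2, x2^2] followed by a
# linear SVM, i.e. a quadratic separating surface fitted directly.
model = make_pipeline(
    PolynomialFeatures(degree=2, include_bias=False),
    StandardScaler(),
    LinearSVC(C=1.0, max_iter=10000),
)
model.fit(X_train, y_train)
print(f"test accuracy: {model.score(X_test, y_test):.3f}")
```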