SVM
Support Vector Machine (SVM) is a supervised learning model used for classification, regression, and outlier detection, though most commonly for classification. SVMs seek a decision boundary, a hyperplane, that best separates data points of different classes by maximizing the margin—the distance between the hyperplane and the nearest points from each class. In the linear case, the hyperplane is defined by w·x + b = 0, and the sign of w·x + b determines the predicted class. To allow some misclassifications, a soft margin introduces slack variables and a penalty parameter C that controls the trade-off between margin width and misclassification.
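A minimal sketch of the linear soft-margin case, assuming scikit-learn is available and using a tiny illustrative two-class dataset; it shows how the fitted hyperplane (w, b) and the sign of w·x + b recover the predictions:

```python
import numpy as np
from sklearn.svm import SVC

# Toy two-class data (illustrative assumption, not from the text).
X = np.array([[0.0, 0.0], [1.0, 1.0], [0.0, 1.0],
              [3.0, 3.0], [4.0, 4.0], [3.0, 4.0]])
y = np.array([-1, -1, -1, 1, 1, 1])

# C controls the trade-off between margin width and misclassification.
clf = SVC(kernel="linear", C=1.0)
clf.fit(X, y)

w = clf.coef_[0]        # weight vector defining the hyperplane w.x + b = 0
b = clf.intercept_[0]

# The predicted class is the sign of w.x + b.
scores = X @ w + b
predictions = np.sign(scores)
```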
The learning problem can be formulated as a convex optimization task. In the soft-margin primal form, one minimizes (1/2)||w||^2 + C Σ_i ξ_i subject to y_i(w·x_i + b) ≥ 1 − ξ_i and ξ_i ≥ 0 for all i, where each slack variable ξ_i measures how far point i falls on the wrong side of the margin. Eliminating the slacks gives the equivalent unconstrained objective (1/2)||w||^2 + C Σ_i max(0, 1 − y_i(w·x_i + b)), whose second term is the hinge loss.
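The soft-margin objective can be evaluated directly for any candidate (w, b). A small NumPy sketch, with toy values chosen purely for illustration:

```python
import numpy as np

def primal_objective(w, b, X, y, C):
    """Soft-margin primal objective: (1/2)||w||^2 + C * sum of slacks,
    where slack_i = max(0, 1 - y_i (w.x_i + b)) is the hinge loss."""
    margins = y * (X @ w + b)
    slacks = np.maximum(0.0, 1.0 - margins)
    return 0.5 * np.dot(w, w) + C * slacks.sum()

# Illustrative values (assumptions, not from the text).
X = np.array([[1.0, 2.0], [-1.0, -1.0]])
y = np.array([1, -1])
w = np.array([0.5, 0.5])
obj = primal_objective(w, 0.0, X, y, C=1.0)
# Both points satisfy the margin here, so only the regularizer
# (1/2)||w||^2 = 0.25 contributes.
```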
To handle non-linear boundaries, SVMs use the kernel trick, mapping inputs into a high-dimensional feature space without ever computing the mapping explicitly: the optimization depends on the data only through inner products, which are replaced by a kernel function K(x_i, x_j) = φ(x_i)·φ(x_j). Common choices include the polynomial kernel and the radial basis function (RBF) kernel K(x, z) = exp(−γ||x − z||^2).
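A short sketch of the kernel trick in practice, assuming scikit-learn: XOR-like data is not linearly separable in the input space, but an RBF-kernel SVM separates it because the kernel implicitly maps the points into a richer feature space. The data and hyperparameters here are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import SVC

# XOR-like data: no single hyperplane in the input space separates the classes.
X = np.array([[0.0, 0.0], [1.0, 1.0], [0.0, 1.0], [1.0, 0.0]])
y = np.array([-1, -1, 1, 1])

# RBF kernel K(x, z) = exp(-gamma ||x - z||^2); only kernel values are
# computed, never the feature map itself.
clf = SVC(kernel="rbf", gamma=1.0, C=10.0)
clf.fit(X, y)
preds = clf.predict(X)
```

A linear kernel would fail on this data, which is the standard illustration of why kernels matter.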