
Understanding Support Vector Machines: Theory, Example, and Python Code

This article explains the fundamentals of Support Vector Machines, describes how they separate data with optimal hyperplanes, provides a 2‑D example with visualizations, and includes Python code using scikit‑learn to generate synthetic data, plot points, and illustrate possible decision boundaries.


1 Support Vector Machine Model

In machine learning, a Support Vector Machine (SVM) is a supervised learning model used for classification and regression analysis. It is a discriminative classifier defined by a separating hyperplane: given labeled training data, the algorithm outputs an optimal hyperplane that is then used to classify new examples.

The SVM represents samples as points in space, mapped so that the examples of different classes are divided by the widest possible margin. Besides linear classification, an SVM can efficiently perform nonlinear classification through the kernel trick, which implicitly maps inputs into a high-dimensional feature space where a linear separator exists.
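As a brief sketch of this point (this example is not from the original article), the following code uses scikit-learn's SVC on concentric circles, a dataset that no straight line can separate. A linear kernel fits poorly, while an RBF kernel separates the classes almost perfectly:

```python
# A minimal sketch: linear vs. RBF kernel on nonlinearly separable data
from sklearn.datasets import make_circles
from sklearn.svm import SVC

# concentric circles are not linearly separable in 2-D
X, y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)

linear_clf = SVC(kernel='linear').fit(X, y)
rbf_clf = SVC(kernel='rbf').fit(X, y)

# training accuracy: the RBF kernel should far outperform the linear one
print(linear_clf.score(X, y))
print(rbf_clf.score(X, y))
```

The RBF kernel never computes the high-dimensional mapping explicitly; it only evaluates inner products between pairs of points, which is what makes the approach efficient.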

2 Example

Assume data points are represented on a two‑dimensional plane:

<code># importing scikit-learn's make_blobs for synthetic data
from sklearn.datasets import make_blobs
import matplotlib.pyplot as plt

# creating a dataset X with n_samples points
# and labels Y containing two classes
X, Y = make_blobs(n_samples=500, centers=2,
                  random_state=0, cluster_std=0.40)

# plotting the points, colored by class
plt.scatter(X[:, 0], X[:, 1], c=Y, s=50, cmap='spring')
plt.show()
</code>

We can draw several lines as classifiers to separate the classes:

Intuitively, among all candidate separators, an SVM finds the optimal line (or, in higher dimensions, hyperplane): the one whose margin to the nearest points of each class is as wide as possible.

<code>import numpy as np
import matplotlib.pyplot as plt

# creating a linspace between -1 and 3.5
xfit = np.linspace(-1, 3.5)
# plotting the scatter of points again
plt.scatter(X[:, 0], X[:, 1], c=Y, s=50, cmap='spring')
# plot several candidate lines, each with a shaded margin of width d
for m, b, d in [(1, 0.65, 0.33), (0.5, 1.6, 0.55), (-0.2, 2.9, 0.2)]:
    yfit = m * xfit + b
    plt.plot(xfit, yfit, '-k')
    plt.fill_between(xfit, yfit - d, yfit + d, edgecolor='none',
                     color='#AAAAAA', alpha=0.4)
plt.xlim(-1, 3.5)
plt.show()
</code>
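To close the loop, a sketch (regenerating the same blobs as above for self-containment) fits an actual linear SVC and plots the maximum-margin boundary it selects, circling the support vectors that define the margin. The large `C` value here is an assumption chosen to approximate a hard margin; the original article does not specify one:

```python
# Fitting a linear SVC to find the maximum-margin separator
from sklearn.datasets import make_blobs
from sklearn.svm import SVC
import matplotlib.pyplot as plt
import numpy as np

X, Y = make_blobs(n_samples=500, centers=2,
                  random_state=0, cluster_std=0.40)

# a large C approximates a hard margin (assumed, not from the article)
clf = SVC(kernel='linear', C=1e10)
clf.fit(X, Y)

# the optimal hyperplane satisfies w.x + b = 0
w = clf.coef_[0]
b = clf.intercept_[0]
xfit = np.linspace(-1, 3.5)
yfit = -(w[0] * xfit + b) / w[1]

plt.scatter(X[:, 0], X[:, 1], c=Y, s=50, cmap='spring')
plt.plot(xfit, yfit, '-k')
# highlight the support vectors lying on the margin
plt.scatter(clf.support_vectors_[:, 0], clf.support_vectors_[:, 1],
            s=200, facecolors='none', edgecolors='k')
plt.show()
```

Only the support vectors determine the boundary; moving any other point (without crossing the margin) would leave the learned hyperplane unchanged.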

Reference

https://www.geeksforgeeks.org/classifying-data-using-support-vector-machinessvms-in-python/

Tags: machine learning, Python, classification, scikit-learn, support vector machine
Written by

Model Perspective

Insights, knowledge, and enjoyment from a mathematical modeling researcher and educator. Hosted by Haihua Wang, a modeling instructor and author of "Clever Use of Chat for Mathematical Modeling", "Modeling: The Mathematics of Thinking", "Mathematical Modeling Practice: A Hands‑On Guide to Competitions", and co‑author of "Mathematical Modeling: Teaching Design and Cases".
