
Master Classic Modeling with Python: LP, Graphs, Clustering, PCA & More

This article presents Python implementations of classic mathematical modeling techniques—including linear programming with PuLP, shortest‑path analysis using NetworkX, K‑means and hierarchical clustering, principal component analysis, frequent‑pattern mining with FP‑Growth, and linear regression and K‑nearest‑neighbors—providing code snippets, explanations, and visualizations to guide readers through each method.


This article introduces classic mathematical modeling techniques and provides their Python implementations.

Linear Programming

Solving a linear programming problem using the PuLP library.

<code>import pulp  # import library
model = pulp.LpProblem("Maximising_problem", pulp.LpMaximize)  # initialize model
x = pulp.LpVariable('x', lowBound=0, cat='Integer')  # decision variable
y = pulp.LpVariable('y', lowBound=0, cat='Integer')  # decision variable
# objective function
model += 5000 * x + 2500 * y, "Profit"
# constraints
model += 3 * x + 2 * y <= 20
model += 4 * x + 3 * y <= 30
model += 4 * x + 3 * y <= 44
# solve model
model.solve()
print(pulp.LpStatus[model.status])
print(x.varValue)
print(y.varValue)
print(pulp.value(model.objective))</code>

Output:

<code>Optimal
6.0
1.0
32500.0</code>

PuLP model representation:

<code>Maximising_problem:
MAXIMIZE
5000*x + 2500*y + 0
SUBJECT TO
_C1: 3 x + 2 y <= 20

_C2: 4 x + 3 y <= 30

_C3: 4 x + 3 y <= 44

VARIABLES
0 <= x Integer
0 <= y Integer</code>
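As a cross-check (an addition, not from the original article), the continuous relaxation of the same model can be solved with SciPy's `linprog`. Since `linprog` minimizes, the objective coefficients are negated; dropping the integrality requirement lifts the optimum from 32500 to roughly 33333.3, an upper bound on the integer solution.

```python
from scipy.optimize import linprog

# LP relaxation of the PuLP model above: maximize 5000x + 2500y
# subject to 3x + 2y <= 20, 4x + 3y <= 30, 4x + 3y <= 44, x, y >= 0.
# linprog minimizes, so the objective is negated.
c = [-5000, -2500]
A_ub = [[3, 2], [4, 3], [4, 3]]
b_ub = [20, 30, 44]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x)     # continuous optimum, approximately [6.667, 0]
print(-res.fun)  # approximately 33333.33, an upper bound on the integer optimum
```

The gap between 33333.33 and 32500 is the price of requiring integer quantities.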

Graph Theory Model

Given a cost matrix of ticket prices between cities (✖ indicates no direct connection), we find the cheapest ticket price between any two cities.
<code>import networkx as nx
import matplotlib.pyplot as plt

# Example cost matrix (illustrative values): tickets[i][j] is the fare from
# city i to city j; pairs absent from the dict have no direct connection (✖).
tickets = {
    'A': {'B': 10, 'C': 25},
    'B': {'C': 12, 'D': 30},
    'C': {'D': 15, 'F': 50},
    'D': {'F': 20},
}
G = nx.DiGraph()  # directed graph to handle asymmetric fares
for i in tickets:
    for j in tickets[i]:
        G.add_edge(i, j, weight=tickets[i][j])
nx.draw_circular(G, with_labels=True)
plt.show()
# cheapest route and its total fare (Dijkstra's algorithm under the hood)
path = nx.shortest_path(G, source='C', target='F', weight='weight')
length = nx.shortest_path_length(G, source='C', target='F', weight='weight')
print(path)    # ['C', 'D', 'F']
print(length)  # 35</code>
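The question asks for the cheapest fare between *any* two cities, not just one pair. A self-contained sketch using NetworkX's all-pairs Dijkstra (the small cost matrix here is assumed for illustration):

```python
import networkx as nx

# illustrative cost matrix (assumed); absent entries mean no direct flight
tickets = {
    'A': {'B': 10, 'C': 25},
    'B': {'C': 12, 'D': 30},
    'C': {'D': 15, 'F': 50},
    'D': {'F': 20},
}
G = nx.DiGraph()
for src, dests in tickets.items():
    for dst, price in dests.items():
        G.add_edge(src, dst, weight=price)

# cheapest total fare between every reachable ordered pair of cities
cheapest = dict(nx.all_pairs_dijkstra_path_length(G, weight='weight'))
print(cheapest['C']['F'])  # 35: routing C -> D -> F beats the direct fare of 50
print(cheapest['A']['F'])  # 57: A -> B -> C -> D -> F
```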

[Figure: circular-layout visualization of the ticket-price graph]

Clustering Algorithms

Clustering partitions unlabeled data into groups of mutually similar points. Below, K-means and hierarchical (agglomerative) clustering are demonstrated with scikit-learn.

<code>from sklearn import cluster
import pandas as pd

# sample dataset
dataset = pd.DataFrame({
    'x': [11, 11, 20, 12, 16, 33, 24, 14, 45, 52, 51, 52, 55, 53, 55, 61, 62, 70, 72, 10],
    'y': [39, 36, 30, 52, 53, 46, 55, 59, 12, 15, 16, 18, 11, 23, 14, 8, 18, 7, 24, 70]
})
# K-means
myKmeans = cluster.KMeans(n_clusters=2, n_init=10, random_state=0)
myKmeans.fit(dataset)
labels = myKmeans.labels_            # cluster index assigned to each point
centers = myKmeans.cluster_centers_  # coordinates of the two centroids
# Agglomerative (hierarchical) clustering; Ward linkage implies Euclidean distances
myCluster = cluster.AgglomerativeClustering(n_clusters=2, linkage='ward')
myCluster.fit(dataset)
agg_labels = myCluster.labels_</code>
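The choice of two clusters above is a judgment call. One common heuristic (an addition here, not from the original article) is the elbow method: run K-means for increasing k and look for the point where within-cluster inertia stops dropping sharply.

```python
from sklearn.cluster import KMeans
import pandas as pd

# same toy dataset as above
dataset = pd.DataFrame({
    'x': [11, 11, 20, 12, 16, 33, 24, 14, 45, 52, 51, 52, 55, 53, 55, 61, 62, 70, 72, 10],
    'y': [39, 36, 30, 52, 53, 46, 55, 59, 12, 15, 16, 18, 11, 23, 14, 8, 18, 7, 24, 70],
})
# within-cluster sum of squares for k = 1..6; the "elbow" where the curve
# flattens suggests a reasonable number of clusters
inertias = []
for k in range(1, 7):
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(dataset)
    inertias.append(km.inertia_)
for k, inertia in zip(range(1, 7), inertias):
    print(k, round(inertia, 1))
```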

[Figure: K-means clustering result]

[Figure: agglomerative clustering animation]

Dimensionality Reduction

Principal Component Analysis (PCA) reduces data dimensionality while preserving variance.

<code>from sklearn.decomposition import PCA
import pandas as pd

iris = pd.read_csv('data/iris.csv')
X = iris.drop('Species', axis=1)        # the four numeric features
pca = PCA(n_components=4)               # keep all components to inspect the full spectrum
pca.fit(X)
print(pca.explained_variance_ratio_)    # share of variance captured by each component
# manual construction of the principal components is omitted for brevity</code>
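To avoid depending on a local CSV, the same analysis can be sketched on scikit-learn's bundled iris data, this time actually projecting the samples onto the first two components:

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X = load_iris().data                        # 150 samples, 4 features
pca = PCA(n_components=2)                   # keep the two strongest directions
X2 = pca.fit_transform(X)                   # project samples onto the principal axes
print(X2.shape)                             # (150, 2)
print(pca.explained_variance_ratio_.sum())  # close to 0.98: two components retain most variance
```

The projected `X2` is what one would feed to a 2-D scatter plot or a downstream model.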

[Figure: PCA process animation]

Pattern Analysis

Frequent-pattern mining discovers itemsets that occur in the data more often than a user-specified support threshold. Below, the FP-Growth algorithm is demonstrated via the pyfpgrowth package.

<code>import pandas as pd
import pyfpgrowth as fp

# sample transaction dataset
dict1 = {
    'id': [0, 1, 2, 3],
    'items': [
        ["wickets", "pads"],
        ["bat", "wickets", "pads", "helmet"],
        ["helmet", "pads"],
        ["bat", "pads", "helmet"]
    ]
}
transactionSet = pd.DataFrame(dict1)
patterns = fp.find_frequent_patterns(transactionSet['items'], 1)  # minimum support count of 1
rules = fp.generate_association_rules(patterns, 0.3)              # minimum confidence of 0.3
print(patterns)
print(rules)</code>
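FP-Growth's advantage is avoiding candidate generation, but on four transactions the same counts can be verified by brute force. This sketch (an addition for intuition, not the FP-Growth algorithm itself) enumerates every itemset and computes support counts and one rule's confidence:

```python
from itertools import combinations
from collections import Counter

transactions = [
    ["wickets", "pads"],
    ["bat", "wickets", "pads", "helmet"],
    ["helmet", "pads"],
    ["bat", "pads", "helmet"],
]
# support count of every itemset occurring in some transaction
counts = Counter()
for t in transactions:
    for r in range(1, len(t) + 1):
        for itemset in combinations(sorted(t), r):
            counts[itemset] += 1

print(counts[('pads',)])          # 4: 'pads' appears in every transaction
print(counts[('bat', 'helmet')])  # 2
# confidence of the rule {bat} -> {helmet}: support(bat, helmet) / support(bat)
print(counts[('bat', 'helmet')] / counts[('bat',)])  # 1.0
```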

[Figure: FP-Growth tree]

Regression and Classification Models

Linear regression with Scikit‑Learn predicts continuous values.

<code>from sklearn.datasets import load_iris
from sklearn.linear_model import LinearRegression

iris = load_iris()
X, y = iris.data[:, :2], iris.data[:, 2]  # predict petal length from the two sepal features
model = LinearRegression()
model.fit(X, y)
print('Score:', model.score(X, y))        # R^2 on the training data
print('Predictions:', model.predict(X))</code>
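Scoring on the data a model was trained on overstates performance. A sketch (added here) of held-out evaluation with scikit-learn's splitter, using the same features and target:

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

iris = load_iris()
X, y = iris.data[:, :2], iris.data[:, 2]  # sepal features -> petal length
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LinearRegression().fit(X_tr, y_tr)
print('Train R^2:', model.score(X_tr, y_tr))
print('Test  R^2:', model.score(X_te, y_te))
```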

[Figure: linear regression fitting animation]

K-Nearest Neighbors (KNN) classifies a sample by majority vote among its k closest training points; the code mirrors the regression example above.

<code>from sklearn.datasets import load_iris
from sklearn.neighbors import KNeighborsClassifier

iris = load_iris()
X, y = iris.data[:, :2], iris.target      # two sepal features, species labels
knn = KNeighborsClassifier(n_neighbors=3)
knn.fit(X, y)
print('Score:', knn.score(X, y))          # accuracy on the training data
print('Predictions:', knn.predict(X))</code>
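As with the regression example, training-set accuracy is optimistic. Cross-validation (a sketch added here) gives a fairer estimate of KNN's accuracy on the same two-feature problem:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

iris = load_iris()
X, y = iris.data[:, :2], iris.target       # same two sepal features as above
knn = KNeighborsClassifier(n_neighbors=3)
scores = cross_val_score(knn, X, y, cv=5)  # 5-fold cross-validated accuracy
print(scores.mean())
```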

[Figure: KNN classification animation]

References: https://zhuanlan.zhihu.com/p/444420809

Tags: Optimization, Python, Clustering, Regression, PCA, Graph Algorithms, Frequent Pattern Mining
Written by Model Perspective

Insights, knowledge, and enjoyment from a mathematical modeling researcher and educator. Hosted by Haihua Wang, a modeling instructor and author of "Clever Use of Chat for Mathematical Modeling", "Modeling: The Mathematics of Thinking", "Mathematical Modeling Practice: A Hands‑On Guide to Competitions", and co‑author of "Mathematical Modeling: Teaching Design and Cases".
