How Sensitivity Analysis Uncovers Shadow Prices and Slack in Linear Programming
This article explains the fundamentals of sensitivity analysis in linear programming, detailing how changes to objective coefficients, constraint bounds, and coefficients affect model outcomes, and demonstrates computing shadow prices and slack variables using Python's PuLP library with a glass manufacturing example.
1 Sensitivity Analysis
Conducting sensitivity analysis is crucial for decision makers to interpret models. Simply building a model is insufficient; we must explain it from various perspectives to maximize its utility.
Sensitivity analysis studies how changes in model parameters affect the optimal solution of a linear program. In this method, we vary one parameter (e.g., a constraint's right‑hand side) while keeping the others constant and observe the impact on the model output.
We can perform sensitivity analysis in three ways:
Impact of changes in objective function coefficient values
Impact of changes in the right‑hand side constants of constraints
Impact of changes in the coefficients of constraints
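The second kind of analysis above (varying a right‑hand side constant) can be sketched by simply re‑solving the model for a range of values and watching the objective. The small LP below (maximize 3x + 2y) is a hypothetical toy model chosen for this illustration, not the article's example:

```python
from pulp import LpProblem, LpMaximize, LpVariable, value, PULP_CBC_CMD

def solve_with_rhs(rhs):
    """Re-solve the toy LP with a given right-hand side for the first constraint."""
    model = LpProblem("toy", LpMaximize)
    x = LpVariable("x", lowBound=0)
    y = LpVariable("y", lowBound=0)
    model += 3 * x + 2 * y          # objective: maximize 3x + 2y
    model += x + y <= rhs           # constraint whose RHS we vary
    model += x + 3 * y <= 6         # fixed constraint
    model.solve(PULP_CBC_CMD(msg=0))
    return value(model.objective)

# Vary one parameter while holding the others constant: the objective rises
# by 3 per extra unit of rhs until the fixed constraint becomes binding.
for rhs in (4, 5, 6, 7):
    print(rhs, "->", solve_with_rhs(rhs))
```

The constant rate of increase (3 per unit) is exactly the shadow price of the varied constraint, and the point where the objective stops improving marks the end of that shadow price's validity range.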
2 Shadow Prices and Slack Variables
A shadow price indicates how much the objective function value changes when the right‑hand side constant of a constraint increases by one unit. A slack variable measures the amount of unused resource in a constraint: the difference between its right‑hand side and the value of its left‑hand side at the optimum. If the slack equals zero, the constraint is binding—altering its right‑hand side changes the optimal solution. A non‑binding constraint has positive slack, and changing its right‑hand side within the allowable range does not affect the optimal objective value.
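PuLP exposes both quantities on each constraint after solving, via the `pi` (shadow price) and `slack` attributes. A minimal sketch, using a deliberately tiny toy model (maximize x subject to x ≤ 5 and x ≤ 10) so the values are easy to verify by hand:

```python
from pulp import LpProblem, LpMaximize, LpVariable, PULP_CBC_CMD

model = LpProblem("toy_duals", LpMaximize)
x = LpVariable("x", lowBound=0)
model += x                      # objective: maximize x
model += x <= 5, "binding"      # binds at the optimum: slack 0, shadow price 1
model += x <= 10, "loose"       # 5 units unused: slack 5, shadow price 0
model.solve(PULP_CBC_CMD(msg=0))

for name, c in model.constraints.items():
    print(name, "shadow price =", c.pi, "slack =", c.slack)
```

Raising the binding constraint's right‑hand side from 5 to 6 would raise the objective by exactly its shadow price of 1, while the loose constraint could be tightened by up to 5 units with no effect at all.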
3 Example
3.1 Decision Variables
A glass manufacturing company produces two products, A and B. Let:
A = quantity of type‑A glass
B = quantity of type‑B glass
3.2 Objective Function
Maximize profit:
Profit = 60·A + 50·B
3.3 Constraints
Constraint 1: 4·A + 10·B ≤ 100
Constraint 2: 2·A + 1·B ≤ 22
Constraint 3: 3·A + 3·B ≤ 39
All constraints must be satisfied simultaneously.
3.4 Solving Code
<code>from pulp import LpProblem, LpMaximize, LpVariable, LpStatus, value
import pandas as pd
# Initialize Class, Define Vars., and Objective
model = LpProblem("Glass_Manufacturing_Industries_Profits", LpMaximize)
# Define variables
A = LpVariable('A', lowBound=0)
B = LpVariable('B', lowBound=0)
# Define Objective Function: Profit on Product A and B
model += 60 * A + 50 * B
# Constraint 1
model += 4 * A + 10 * B <= 100
# Constraint 2
model += 2 * A + 1 * B <= 22
# Constraint 3
model += 3 * A + 3 * B <= 39
# Solve Model
model.solve()
print("Model Status:{}".format(LpStatus[model.status]))
print("Objective = ", value(model.objective))
for v in model.variables():
    print(v.name, "=", v.varValue)</code>
Output:
Model Status:Optimal
Objective = 740.0
A = 9.0
B = 4.0
3.5 Sensitivity Analysis: Calculating Shadow Prices and Slack Variables
<code>o = [{'name': name, 'shadow price': c.pi, 'slack': c.slack}
     for name, c in model.constraints.items()]
pd.DataFrame(o)</code>
The shadow price for constraint C2 is 10 and for C3 is approximately 13.33 (40/3): a one‑unit increase in either right‑hand side constant would raise the optimal objective value by that amount.
Constraint C1 has a slack of 24, so it is non‑binding: 24 units of that resource go unused at the optimum. Constraints C2 and C3 have zero slack and are therefore binding; tightening or relaxing either one changes the optimal objective.
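The shadow price of C2 can be verified directly by rebuilding the model with its right‑hand side increased by one unit and re‑solving. A minimal sketch, wrapping the article's model in a helper function (`glass_profit` is a name introduced here for illustration):

```python
from pulp import LpProblem, LpMaximize, LpVariable, value, PULP_CBC_CMD

def glass_profit(c2_rhs):
    """Rebuild the article's glass-manufacturing model with an adjustable
    right-hand side for constraint 2, and return the optimal profit."""
    m = LpProblem("glass", LpMaximize)
    A = LpVariable("A", lowBound=0)
    B = LpVariable("B", lowBound=0)
    m += 60 * A + 50 * B            # profit objective
    m += 4 * A + 10 * B <= 100      # constraint 1
    m += 2 * A + 1 * B <= c2_rhs    # constraint 2, RHS adjustable
    m += 3 * A + 3 * B <= 39        # constraint 3
    m.solve(PULP_CBC_CMD(msg=0))
    return value(m.objective)

base = glass_profit(22)             # the original model: 740
bumped = glass_profit(23)           # RHS of C2 raised by one unit
print(bumped - base)                # matches C2's shadow price of 10
```

The same check on constraint 1 would show no change at all, since its slack of 24 means extra units of that resource cannot improve the solution.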
4 Summary
This article introduced key concepts of linear programming sensitivity analysis—shadow prices and slack variables—and demonstrated a practical example solved with the PuLP library.
Reference
https://machinelearninggeek.com/sensitivity-analysis-in-python/
Model Perspective
Insights, knowledge, and enjoyment from a mathematical modeling researcher and educator. Hosted by Haihua Wang, a modeling instructor and author of "Clever Use of Chat for Mathematical Modeling", "Modeling: The Mathematics of Thinking", "Mathematical Modeling Practice: A Hands‑On Guide to Competitions", and co‑author of "Mathematical Modeling: Teaching Design and Cases".