Finding the best solution under constraints.
Engineering Applications
- Resource Allocation: CPU, memory, budget distribution
- Performance Tuning: Finding optimal configurations
- ML Training: Gradient descent, hyperparameter tuning
- Scheduling: Task assignment, load balancing
- Cost Optimisation: Cloud spend, infrastructure sizing
Core Concepts
Objective Function
The quantity you're trying to minimise or maximise.
minimise f(x) subject to constraints g(x) ≤ 0
Local vs Global Optima
- Local: Best in neighbourhood
- Global: Best overall
Many problems have multiple local optima, so you need strategies to escape them, such as random restarts (sketched below).
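One minimal sketch of the random-restart idea: run a simple local search from several random starting points and keep the best result. The one-dimensional objective and the hill-climbing parameters here are made-up illustrations, not from the original notes.

```python
import random

def f(x):
    # Made-up objective with two local minima (global minimum near x ≈ -0.18)
    return x**4 - 3 * x**3 + 2 * x**2 + x

def hill_climb(x, step=0.05, iters=2000):
    """Greedy local search: accept a random neighbour only if it improves f."""
    for _ in range(iters):
        candidate = x + random.uniform(-step, step)
        if f(candidate) < f(x):
            x = candidate
    return x

# Random restarts: local search from many starting points, keep the best
starts = [random.uniform(-2, 4) for _ in range(20)]
best = min((hill_climb(x0) for x0 in starts), key=f)
print(round(best, 3), round(f(best), 3))
```

A single run of the local search can get stuck in the worse of the two minima; restarting from enough points makes finding the global one very likely.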
Gradient Descent
x_new = x_old - α * ∇f(x)
Move in the direction of steepest descent.
- α (learning rate): Step size
- ∇f: Gradient (direction of steepest ascent)
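A minimal sketch of this update rule in plain Python, minimising a one-dimensional quadratic. The objective, starting point, learning rate, and iteration count are all illustrative choices.

```python
def grad_f(x):
    # Gradient of f(x) = (x - 3)^2, whose minimum is at x = 3
    return 2 * (x - 3)

x = 0.0        # starting point
alpha = 0.1    # learning rate (step size)
for _ in range(100):
    x = x - alpha * grad_f(x)   # x_new = x_old - α * ∇f(x)

print(x)  # ≈ 3.0
```

If α is too large the updates overshoot and can diverge; if it is too small, convergence is slow.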
Convexity
In a convex problem every local optimum is also global, which makes it much easier to solve.
Non-convex problems require more sophisticated approaches.
Optimisation Strategies
| Strategy | When to Use |
|---|---|
| Grid Search | Small parameter space, need exhaustive coverage |
| Random Search | Large space where exhaustive coverage is impractical |
| Gradient Descent | Differentiable objective |
| Bayesian Optimisation | Expensive evaluations, need sample efficiency |
| Genetic Algorithms | Complex landscapes, no gradient available |
System Design Connections
Hyperparameter Tuning
```python
# Grid search: exhaustively evaluate every combination of candidate values
for lr in [0.001, 0.01, 0.1]:
    for batch_size in [32, 64, 128]:
        evaluate(lr, batch_size)  # train and score this configuration
```
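For larger spaces, random search typically finds good configurations with far fewer evaluations. A sketch along the same lines, reusing the same placeholder evaluate call; the sampling ranges and trial count are illustrative.

```python
import random

# Random search: sample configurations instead of enumerating every combination
for _ in range(10):
    lr = 10 ** random.uniform(-3, -1)          # log-uniform between 0.001 and 0.1
    batch_size = random.choice([32, 64, 128])
    evaluate(lr, batch_size)                   # same placeholder as the grid search
```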
Cost-Performance Trade-offs
```
minimise    cost
subject to  latency_p99 ≤ 100 ms
            availability ≥ 99.9%
```
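When the decision space is small and discrete, such as choosing an instance type, this kind of constrained problem can be solved by filtering out infeasible options and taking the cheapest of the rest. A sketch with made-up instance data, not real cloud prices:

```python
# Hypothetical candidates: (name, hourly_cost, p99_latency_ms, availability)
candidates = [
    ("small",  0.05, 140, 0.9995),
    ("medium", 0.10,  90, 0.9992),
    ("large",  0.20,  60, 0.9999),
]

# minimise cost subject to latency_p99 ≤ 100 ms and availability ≥ 99.9%
feasible = [c for c in candidates if c[2] <= 100 and c[3] >= 0.999]
best = min(feasible, key=lambda c: c[1])
print(best[0])  # "medium": the cheapest option that meets both constraints
```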
Auto-tuning
Systems that optimise their own parameters based on observed performance.
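A toy sketch of the feedback loop: adjust one parameter (here a hypothetical cache size) based on observed p99 latency, growing it when latency exceeds a target and shrinking it when there is headroom. The measure_p99_ms and apply_size_mb callables are assumed hooks into the running system, not a real API.

```python
def autotune_cache_size(measure_p99_ms, apply_size_mb,
                        target_ms=100, size_mb=256,
                        min_mb=64, max_mb=4096, step_mb=64, rounds=50):
    """Feedback loop: grow the cache while latency is above target,
    shrink it when latency is comfortably below target."""
    for _ in range(rounds):
        apply_size_mb(size_mb)        # apply the current setting
        latency = measure_p99_ms()    # observe its effect
        if latency > target_ms:
            size_mb = min(max_mb, size_mb + step_mb)
        elif latency < 0.8 * target_ms:
            size_mb = max(min_mb, size_mb - step_mb)
    return size_mb
```

The fixed step size keeps the loop simple and stable; more sophisticated auto-tuners treat this as an online optimisation problem and adapt the step size as well.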
Key Insight
Most engineering decisions are optimisation problems in disguise:
- “What instance type should I use?” → Cost-performance optimisation
- “How many retries?” → Reliability-latency trade-off
- “Cache size?” → Memory-latency trade-off
Framing problems as optimisation clarifies the trade-offs.

