
Commit e9a5745

committed Sep 10, 2024
python Convex Optimization visualization the process
1 parent 72703d0 commit e9a5745

File tree

2 files changed: +78 -0 lines changed

2 files changed

+78
-0
lines changed
 

‎convex_optimization_visualization.png

214 KB
Loading

‎convex_optimization_visualization.py

Lines changed: 78 additions & 0 deletions
Original file line numberDiff line numberDiff line change
@@ -0,0 +1,78 @@
## ---- Python: visualizing the convex optimization process


# Visualizing the process of **Convex Optimization** typically involves showing how an algorithm (such as gradient descent) iterates toward the minimum of a convex function. Here's a general way to visualize this process in Python using `matplotlib` and `numpy`.
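
# For reference, the update that the visualization traces is the standard gradient descent step

# \( x_{k+1} = x_k - \eta \, \nabla f(x_k) \),

# where \( \eta \) is the learning rate; for a smooth convex function and a small enough \( \eta \), each step lowers the function value toward its minimum.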

# ### Steps:
# 1. **Generate a convex function**: For simplicity, use a quadratic function (e.g., \( f(x) = x^2 \)).
# 2. **Apply an optimization algorithm**: Use gradient descent as the optimization algorithm.
# 3. **Visualize the process**: Plot the function and show how the optimization algorithm converges to the minimum.

# Here's how you can implement it:

# ### Python Code:

# ```python
import numpy as np
import matplotlib.pyplot as plt

# Convex function (quadratic function)
def f(x):
    return x ** 2

# Derivative of the convex function (gradient)
def grad_f(x):
    return 2 * x

# Gradient descent algorithm
def gradient_descent(starting_point, learning_rate, num_iterations):
    x = starting_point
    trajectory = [x]
    for _ in range(num_iterations):
        x = x - learning_rate * grad_f(x)
        trajectory.append(x)
    return np.array(trajectory)

# Visualization of the optimization process
def visualize_optimization(trajectory):
    # Define x values and their corresponding function values
    x_vals = np.linspace(-3, 3, 400)
    y_vals = f(x_vals)

    plt.figure(figsize=(8, 6))
    plt.plot(x_vals, y_vals, label=r'$f(x) = x^2$', color='blue')

    # Plot the trajectory of the optimization algorithm
    for i, x in enumerate(trajectory):
        plt.plot(x, f(x), 'ro')  # Mark the point
        plt.text(x, f(x), f'Iter {i}', fontsize=10)
        if i > 0:
            plt.arrow(trajectory[i-1], f(trajectory[i-1]),
                      trajectory[i] - trajectory[i-1],
                      f(trajectory[i]) - f(trajectory[i-1]),
                      head_width=0.1, head_length=0.1, fc='green', ec='green')

    plt.xlabel('x')
    plt.ylabel('f(x)')
    plt.title('Convex Optimization using Gradient Descent')
    plt.legend()
    plt.grid(True)
    plt.show()

# Parameters for gradient descent
starting_point = 2.5
learning_rate = 0.1
num_iterations = 10

# Get the trajectory of the optimization process
trajectory = gradient_descent(starting_point, learning_rate, num_iterations)

# Visualize the optimization process
visualize_optimization(trajectory)
# ```

# ### Explanation:
# 1. **Convex function**: \( f(x) = x^2 \), which is a simple convex function.
# 2. **Gradient descent**: The optimization algorithm runs for a set number of iterations, updating \( x \) based on the gradient (which is \( 2x \)); see the short derivation after this list.
# 3. **Visualization**: The function is plotted, and each iteration of the optimization is marked. The movement of the points toward the minimum is visualized with arrows.
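
# For this particular function, the convergence can be made explicit: each step is

# \( x_{k+1} = x_k - \eta \cdot 2 x_k = (1 - 2\eta)\, x_k \),

# so after \( k \) steps \( x_k = (1 - 2\eta)^k x_0 \). With the values used above (\( \eta = 0.1 \), \( x_0 = 2.5 \)), each iterate is 0.8 times the previous one, which is the geometric approach to \( x = 0 \) seen in the plot.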
77+
78+
# This code will generate a visualization where you can see the steps of the gradient descent as it moves towards the minimum at \( x = 0 \). You can adjust the learning rate and the number of iterations to observe different behaviors of the optimization process.
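
# One way to do that comparison in a single figure: the minimal sketch below reuses the
# `gradient_descent` function defined above and plots the iterate value per iteration for a
# few example learning rates (the specific rates are illustrative only). For \( f(x) = x^2 \)
# each step multiplies \( x \) by \( 1 - 2\eta \), so rates below 0.5 converge monotonically,
# rates between 0.5 and 1 oscillate while converging, and rates above 1 diverge.
for lr in [0.1, 0.4, 0.9, 1.05]:
    traj = gradient_descent(starting_point=2.5, learning_rate=lr, num_iterations=10)
    plt.plot(traj, marker='o', label=f'learning rate = {lr}')
plt.xlabel('iteration')
plt.ylabel('x')
plt.title('Effect of the learning rate on gradient descent')
plt.legend()
plt.grid(True)
plt.show()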
