▸ Gradient Descent Explorer
Optimization Techniques — Interactive Lab
Minimizing
f(x₁,x₂) = ½(κ·x₁² + x₂²)
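Minimizing this quadratic with plain gradient descent can be sketched as below. This is a minimal NumPy illustration, not the app's own code; the names `gradient_descent` and `grad_f` are mine, and the defaults κ = 10, α = 0.1 mirror the panel defaults (for this f the Hessian is diag(κ, 1), so L = κ, σ = 1, and α = 1/L):

```python
import numpy as np

def f(x, kappa):
    """Objective f(x1, x2) = 1/2 * (kappa*x1^2 + x2^2)."""
    return 0.5 * (kappa * x[0]**2 + x[1]**2)

def grad_f(x, kappa):
    """Gradient (kappa*x1, x2). Hessian = diag(kappa, 1), so
    L = kappa (smoothness) and sigma = 1 (strong convexity) for kappa >= 1."""
    return np.array([kappa * x[0], x[1]])

def gradient_descent(x0, kappa, alpha, max_iter=200, tol=1e-8):
    """Iterate x <- x - alpha * grad_f(x) until the gradient norm is small."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = grad_f(x, kappa)
        if np.linalg.norm(g) < tol:
            break
        x = x - alpha * g
    return x, k

# Panel defaults: kappa = 10, alpha = 1/L = 0.1
x_star, iters = gradient_descent([2.0, 2.0], kappa=10.0, alpha=0.1)
```

With α = 1/L the x₁ coordinate is eliminated in one step, while x₂ contracts by a factor 1 − σ/L = 0.9 per iteration, which is exactly the κ-dependent rate the convergence plot visualizes.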
L =
—
σ =
—
κ =
—
x* =
—
Q =
—
r =
—
Contour Plot + Iterates
Click to set x⁰
Convergence: f(xᵏ) − f*
Method
GD
SGD
Proj. GD
Cond. GD
Objective Function
Step-Size Strategy
α = constant
Exact line search
Backtracking (Armijo)
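The three step-size strategies above differ only in how α is chosen each iteration. A minimal sketch of the last two, assuming the lab's quadratic (the helper names `backtracking` and `exact_line_search` are mine; `c` and `rho` correspond to the Armijo c and ρ sliders in the Parameters panel):

```python
import numpy as np

def backtracking(f, grad_f, x, alpha0=1.0, c=0.3, rho=0.5, max_back=50):
    """Armijo backtracking: shrink alpha by rho until the sufficient-decrease
    condition f(x - alpha*g) <= f(x) - c*alpha*||g||^2 holds."""
    g = grad_f(x)
    fx = f(x)
    alpha = alpha0
    for _ in range(max_back):
        if f(x - alpha * g) <= fx - c * alpha * np.dot(g, g):
            break
        alpha *= rho
    return alpha

def exact_line_search(A, g):
    """Exact step for a quadratic f(x) = 1/2 x^T A x along -g:
    alpha* = (g^T g) / (g^T A g)."""
    return np.dot(g, g) / np.dot(g, A @ g)

# Example on the lab objective with kappa = 10, starting at (2, 2)
kappa = 10.0
A = np.diag([kappa, 1.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
x = np.array([2.0, 2.0])
alpha_bt = backtracking(f, grad, x, c=0.3, rho=0.5)
alpha_ex = exact_line_search(A, grad(x))
```

Backtracking only needs function values, while the exact step exploits the quadratic form; both adapt to the local curvature instead of using one fixed α.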
Step-Size Strategy (PGD)
α = constant
α = 1/L
Backtracking (Armijo + proj.)
Constraint Set Q
‖x‖₂ ≤ r (Euclidean ball)
‖x‖₁ ≤ r (Diamond / L₁)
‖x‖∞ ≤ r (Box / L∞)
x ≥ 0, Σxᵢ ≤ r (Simplex)
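Projected GD needs the Euclidean projection onto each of these sets Q. A minimal sketch of the four projections, assuming standard formulas (the ball and box projections are closed-form; the simplex and L₁-ball cases use the sort-based threshold rule; all function names are mine):

```python
import numpy as np

def proj_l2_ball(x, r):
    """Project onto {||x||_2 <= r}: rescale if outside."""
    n = np.linalg.norm(x)
    return x if n <= r else (r / n) * x

def proj_box(x, r):
    """Project onto {||x||_inf <= r}: clip coordinate-wise."""
    return np.clip(x, -r, r)

def proj_simplex_set(x, r):
    """Project onto {x >= 0, sum x_i <= r}."""
    y = np.maximum(x, 0.0)
    if y.sum() <= r:
        return y  # sum constraint inactive
    # otherwise project onto {x >= 0, sum x_i = r} via a sorted threshold
    u = np.sort(x)[::-1]
    css = np.cumsum(u)
    ks = np.arange(1, x.size + 1)
    rho = ks[u - (css - r) / ks > 0][-1]
    theta = (css[rho - 1] - r) / rho
    return np.maximum(x - theta, 0.0)

def proj_l1_ball(x, r):
    """Project onto {||x||_1 <= r}: project |x| onto the simplex of
    radius r, then restore the signs."""
    if np.abs(x).sum() <= r:
        return x
    return np.sign(x) * proj_simplex_set(np.abs(x), r)
```

Each PGD step is then `x = proj(x - alpha * grad_f(x))`; only the projection changes when you switch the constraint set in the panel.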
Step-Size Strategy (FW)
αₖ = 2/(k+2) (standard)
αₖ = α (constant)
Line search on [0,1]
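Conditional GD (Frank-Wolfe) replaces the projection with a linear minimization oracle over Q and mixes toward its output. A minimal sketch with the standard step αₖ = 2/(k+2) and the LMO for the Euclidean ball (function names are mine; `gap` is the FW duality gap gₖ = ∇f(xₖ)ᵀ(xₖ − sₖ) reported in the Results panel, which upper-bounds f(xₖ) − f*):

```python
import numpy as np

def frank_wolfe(grad_f, lmo, x0, max_iter=200, tol=1e-6):
    """Conditional gradient: s_k = argmin_{s in Q} <grad f(x_k), s>,
    then x_{k+1} = (1 - alpha_k) x_k + alpha_k s_k with alpha_k = 2/(k+2).
    Stops when the duality gap <grad f(x_k), x_k - s_k> drops below tol."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = grad_f(x)
        s = lmo(g)
        gap = g @ (x - s)
        if gap < tol:
            break
        alpha = 2.0 / (k + 2)
        x = (1 - alpha) * x + alpha * s
    return x, gap

def lmo_l2_ball(g, r=3.0):
    """LMO for {||s||_2 <= r}: the minimizer of <g, s> is s = -r * g / ||g||."""
    n = np.linalg.norm(g)
    return np.zeros_like(g) if n == 0 else -(r / n) * g

# Lab objective with kappa = 10, ball radius r = 3 (panel defaults)
kappa, r = 10.0, 3.0
grad = lambda x: np.array([kappa * x[0], x[1]])
x, gap = frank_wolfe(grad, lambda g: lmo_l2_ball(g, r), x0=[2.5, -1.0])
```

Since every iterate is a convex combination of feasible points, Frank-Wolfe stays inside Q without ever projecting, which is why it only needs the LMO.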
Parameters
Step size α
0.010
Armijo c
0.30
Armijo ρ
0.50
Constraint radius r
3.00
Step size α
0.10
Step size α
0.050
Mini-batch N
1
Decay τ (αₖ = α₀/(1+k/τ))
50
Samples m (components)
30
Max iterations
200
Condition κ(A)
10
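The SGD parameters above (mini-batch N, decay τ, samples m) fit the usual finite-sum setup f(x) = (1/m) Σᵢ fᵢ(x). A minimal sketch assuming additive-noise components whose average gradient is the lab objective's — an assumption about how the demo builds its m components, and all names here (`sgd`, `grad_i`) are mine:

```python
import numpy as np

def sgd(grad_i, m, x0, alpha0=0.05, tau=50.0, batch=1, max_iter=200, rng=None):
    """SGD on f(x) = (1/m) sum_i f_i(x): each step samples a mini-batch of
    size `batch`, averages the component gradients, and uses the decayed
    step size alpha_k = alpha0 / (1 + k/tau) from the panel."""
    rng = np.random.default_rng(rng)
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        idx = rng.choice(m, size=batch, replace=False)
        g = np.mean([grad_i(i, x) for i in idx], axis=0)
        alpha = alpha0 / (1.0 + k / tau)
        x = x - alpha * g
    return x

# Hypothetical components: grad f_i(x) = A x + b_i with the b_i recentered
# to mean zero, so the full gradient is exactly A x (the lab quadratic).
kappa, m = 10.0, 30
A = np.diag([kappa, 1.0])
b = np.random.default_rng(0).normal(size=(m, 2))
b -= b.mean(axis=0)
grad_i = lambda i, x: A @ x + b[i]
x = sgd(grad_i, m, x0=[2.0, 2.0], rng=1)
```

The 1/(1+k/τ) decay tames the gradient noise: a constant α would leave the iterates hovering in a noise ball around x*, while the shrinking steps let them settle.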
Controls
▶ Run
Step
Reset
Results
Iterations:
—
f(xᵏ):
—
‖∇f(xᵏ)‖:
—
Duality gap gₖ:
—
αₖ used:
—
Status:
Ready