Lecture 3: Convergence Proof
Okay, if there are no questions about last time, then I think we’ll just jump in and start on subgradient methods. The subgradient method is embarrassingly simple, right: at each iteration you take a step in a negative subgradient direction. ... See the whole transcript at [[http://see.stanford.edu/materials/lsocoee364b/transcripts/ConvexOptimizationII-Lecture03.pdf|Convex Optimization II - Lecture 03]]
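The update being described can be sketched in a few lines. This is a minimal illustration, not the lecture's own code; the objective f(x) = |x - 3|, the starting point, and the diminishing step sizes alpha_k = 1/k are all assumptions chosen for the example. Because the subgradient method is not a descent method, we track the best point found so far.

```python
def subgradient_method(f, subgrad, x0, num_iters=1000):
    """Basic subgradient method: x_{k+1} = x_k - alpha_k * g_k,
    with diminishing step sizes alpha_k = 1/k and tracking of the
    best objective value seen (the method need not descend)."""
    x = x0
    f_best = f(x0)
    for k in range(1, num_iters + 1):
        g = subgrad(x)          # any subgradient of f at x
        x = x - (1.0 / k) * g   # step in a negative subgradient direction
        f_best = min(f_best, f(x))
    return f_best

# Example (assumed for illustration): minimize f(x) = |x - 3|.
# A valid subgradient is sign(x - 3), with 0 allowed at x = 3.
f = lambda x: abs(x - 3.0)
subgrad = lambda x: (1.0 if x > 3.0 else (-1.0 if x < 3.0 else 0.0))
f_best = subgradient_method(f, subgrad, x0=0.0)
print(f_best)  # close to the optimal value 0
```

With step sizes alpha_k = 1/k (square-summable is not required, but sum alpha_k diverges), f_best converges to the optimal value; this is the convergence result the lecture goes on to prove.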