
Stochastic optimization with non-i.i.d. noise
Published on Feb 4, 2025 · 3985 views
We study the convergence of a class of stable online algorithms for stochastic convex optimization in settings where we do not receive independent samples from the distribution over which we optimize.
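
To make the setting concrete, here is a minimal sketch of our own (not the speaker's algorithm or code): projected stochastic gradient descent in Python, driven by samples from a "sticky" two-state Markov chain. Consecutive gradients are strongly dependent, but because the chain mixes geometrically, the averaged iterate still approaches the minimizer of the objective under the chain's stationary distribution. All constants and names are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Sticky two-state Markov chain over data points: consecutive samples are
# strongly dependent, so the gradient noise is non-i.i.d. but mixes geometrically.
P = np.array([[0.95, 0.05],
              [0.05, 0.95]])
data = np.array([-1.0, 3.0])    # observation attached to each state
# The stationary distribution of P is uniform, so the long-run objective is
# F(x) = 0.5 * E[(x - d)^2] with d ~ Uniform{-1, 3}, minimized at x = 1.

x, state = 0.0, 0
avg = 0.0
T = 20_000
for t in range(1, T + 1):
    g = x - data[state]                           # stochastic gradient at the current sample
    x = np.clip(x - g / np.sqrt(t), -10.0, 10.0)  # gradient step + projection onto [-10, 10]
    avg += (x - avg) / t                          # running average of the iterates
    state = rng.choice(2, p=P[state])             # next sample comes from the chain, not i.i.d.

print(f"averaged iterate: {avg:.3f}  (stationary optimum: 1.000)")

Making the chain stickier (e.g., transition probability 0.999) slows its mixing and visibly slows convergence; this dependence on mixing time is the kind of effect the convergence rates discussed in the talk quantify.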
Related categories: Presentation
Stochastic optimization with non-i.i.d. noise (00:00)
Basic setup - 01 (03:26)
Basic setup - 02 (20:12)
Basic setup - 03 (23:35)
Basic setup - 04 (32:18)
Online Convex Optimization - 01 (41:22)
Online Convex Optimization - 02 (1:01:21)
I.I.D. sampling not always possible (1:10:42)
Formal setup - 01 (1:24:33)
Formal setup - 02 (1:43:31)
Example 1 - 01 (2:04:23)
Example 1 - 02 (2:16:01)
Example 1 - 03 (2:26:39)
Example 2 (2:35:23)
Stable online algorithms (2:53:59)
Convergence rate for convex losses - 01 (3:22:44)
Convergence rate for convex losses - 02 (3:44:36)
Specialization to particular algorithms (3:53:56)
Specialization to geometric mixing (4:13:21)
Simulation (4:26:39)
Example - 01 (4:35:07)
Example - 02 (4:36:15)
Example - 03 (4:36:51)
Example - 04 (4:36:58)
Convergence rate of MIGD (4:39:02)
Convergence plot for MIGD (4:51:29)
Strongly convex loss functions (4:57:37)
Better guarantees for strongly convex losses - 01 (5:05:43)
Better guarantees for strongly convex losses - 02 (5:14:47)
Fast rates for linear prediction - 01 (5:21:56)
Fast rates for linear prediction - 02 (5:26:36)