Description
Enhancement
Enable "schedulers" for the constraints.
Consider a base CMP with objective function $f(x)$ and constraint $g(x) \le \epsilon$. Enabling a scheduler means solving, at step $t$, the relaxed problem
$$ \min_x f(x) \quad \text{s.t.} \quad g(x) \le \epsilon + \psi_t,$$
where the "slack" $\psi_t$ is gradually annealed towards zero, recovering the original constraint level $\epsilon$.
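As a concrete sketch of what a schedule could look like (the function name and signature here are hypothetical, not part of any existing API), a linear decay of the slack might be:

```python
def linear_slack(step: int, total_steps: int, initial_slack: float) -> float:
    """Linearly anneal the constraint slack psi_t from initial_slack to 0."""
    fraction_left = max(0.0, 1.0 - step / total_steps)
    return initial_slack * fraction_left


# The effective constraint level at step t is epsilon + psi_t.
epsilon = 0.1
print(epsilon + linear_slack(0, 100, 0.5))    # relaxed constraint early in training
print(epsilon + linear_slack(100, 100, 0.5))  # back to the base constraint level
```

Other decay shapes (exponential, step-wise) would fit the same pattern; only the formula inside the schedule changes.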
Motivation
Theoretically, this sequence of optimization problems should yield solutions equivalent to those of the base CMP. However, specific algorithms (and their implementations) can benefit from relaxing the optimization problem, especially towards the beginning of the optimization.
In the end, we care about achieving an (approximately) feasible solution of the base CMP with a good objective value. Thus, there is no need to be overly strict at the beginning of training in trying to achieve the "final" constraint level.
"Curriculum learning" (Bengio et al., 2009) successfully the idea of gradually adjusting the problem difficulty for supervised learning tasks in ML.
Implementation proposal
PyTorch's learning rate schedulers are usually tied to a particular optimizer. For this reason, they might not be directly portable for implementing constraint schedulers, but some of their scheduler framework and implementations could be re-used.
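A minimal sketch of what such a scheduler could look like, mimicking the `step()`-based interface of `torch.optim.lr_scheduler` but decoupled from any optimizer. All class and method names below are hypothetical suggestions, not an existing API:

```python
class ConstraintScheduler:
    """Hypothetical base class mirroring torch.optim.lr_scheduler's
    interface: call step() once per iteration (or epoch)."""

    def __init__(self, initial_slack: float):
        self.initial_slack = initial_slack
        self.last_step = 0

    def get_slack(self) -> float:
        # Subclasses implement the actual schedule psi_t.
        raise NotImplementedError

    def step(self) -> None:
        self.last_step += 1


class ExponentialSlackDecay(ConstraintScheduler):
    """Decay the slack by a multiplicative factor gamma per step,
    analogous to torch.optim.lr_scheduler.ExponentialLR."""

    def __init__(self, initial_slack: float, gamma: float):
        super().__init__(initial_slack)
        self.gamma = gamma

    def get_slack(self) -> float:
        return self.initial_slack * self.gamma ** self.last_step


# Usage sketch: the effective constraint level at each step is
# epsilon + scheduler.get_slack(), tightening as training proceeds.
epsilon = 0.1
scheduler = ExponentialSlackDecay(initial_slack=0.5, gamma=0.9)
for _ in range(3):
    print(epsilon + scheduler.get_slack())
    scheduler.step()
```

Keeping the scheduler independent of the optimizer (unlike PyTorch's LR schedulers) would let the same object drive the constraint level regardless of which optimizer updates the primal or dual variables.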
References
- Bengio, Louradour, Collobert and Weston (2009). Curriculum Learning. ICML 2009.