Constraint schedulers #31

@gallego-posada

Description

Enhancement

Enable "schedulers" for the constraints.

Consider a base CMP with objective function $f$ and inequality constraints $g \le \epsilon$. The scheduler could allow the construction of a "moving target" where the constraint is gradually strengthened. One could start with a sequence of optimization problems:
$$ \min_x f(x) \quad \text{s.t.} \quad g(x) \le \epsilon + \psi_t$$
such that the "slack" $\psi_t \rightarrow 0$.
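As a minimal sketch of the "moving target" above, the relaxed constraint level $\epsilon + \psi_t$ could be computed with an exponentially decaying slack (the function names, `psi0`, and `decay` values here are illustrative assumptions, not an existing API):

```python
import math

def slack(t, psi0=1.0, decay=0.01):
    """Exponentially decaying slack psi_t -> 0 as t grows.

    psi0 and decay are hypothetical hyperparameters for illustration.
    """
    return psi0 * math.exp(-decay * t)

def constraint_level(t, epsilon=0.05, psi0=1.0, decay=0.01):
    """Effective (relaxed) constraint level epsilon + psi_t at step t."""
    return epsilon + slack(t, psi0, decay)
```

Early in training the effective level is loose (`constraint_level(0)` is `epsilon + psi0`), and it approaches the target `epsilon` as `t` grows.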

Motivation

Theoretically, this sequence of optimization problems should yield equivalent solutions to that of the base CMP. However, specific (implementations of) algorithms can benefit from relaxing the optimization problem, especially towards the beginning of the optimization.

In the end, we might care about achieving an (approximately) feasible solution of the base CMP that has a good value of the objective function. Thus, there is no need to be overly strict at the beginning of training by trying to enforce the "final" constraint level $\epsilon$.

"Curriculum learning" (Bengio et al., 2009) successfully applied the idea of gradually adjusting the problem difficulty to supervised learning tasks in ML.

Implementation proposal

PyTorch's learning rate schedulers are usually tied to a particular optimizer. For this reason, they might not be directly portable for implementing constraint schedulers, but parts of their scheduler framework and implementations could be re-used.
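One possible shape for such a scheduler, mirroring the `step()`-based interface of `torch.optim.lr_scheduler` but decoupled from any optimizer, might look like the following. The class name, constructor arguments, and decay rule are assumptions for illustration, not an existing Cooper or PyTorch API:

```python
class ConstraintScheduler:
    """Hypothetical sketch of a constraint-level scheduler.

    Tracks a slack psi_t that decays toward 0, so the relaxed level
    epsilon + psi_t approaches the target constraint level epsilon.
    The step()-based interface is modeled on torch.optim.lr_scheduler
    (e.g. ExponentialLR), but holds the slack itself instead of being
    attached to an optimizer.
    """

    def __init__(self, epsilon, initial_slack, gamma=0.99):
        self.epsilon = epsilon      # target constraint level
        self.slack = initial_slack  # current slack psi_t
        self.gamma = gamma          # multiplicative decay per step

    def step(self):
        """Decay the slack, analogous to calling an LR scheduler's step()."""
        self.slack *= self.gamma

    def current_level(self):
        """Relaxed constraint level epsilon + psi_t for the current step."""
        return self.epsilon + self.slack
```

In a training loop, `scheduler.step()` would be called once per epoch (or per iteration), and the CMP would read `scheduler.current_level()` when evaluating constraint violations.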

References

Bengio, Y., Louradour, J., Collobert, R., & Weston, J. (2009). Curriculum Learning. ICML 2009.
