Suppose we are given a convex function $f(\cdot)$ on $[0,1]$. One wants to solve the following optimization problem:
\begin{equation} \begin{aligned} & \text{minimize} && \sum_{i=1}^n \alpha_i f(x_i), \\ & \text{subject to} && \sum_{j=1}^n \beta_j g_l(x_j) \leq 0, \text{ for } l \in \{1,2,...,m\}, \end{aligned} \end{equation}
where $x_i \in [0,1]$ for $i \in \{1,2,...,n\}$, and the coefficients $\alpha_i$ and $\beta_j$ also lie in $[0,1]$. Moreover, each $g_l(\cdot)$ is linear on $[0,1]$, for $l \in \{1,2,...,m\}$.
I am wondering how to solve this problem with the gradient descent method. That is, suppose we repeatedly update the variables via, say, $x_i^{(t+1)} = x_i^{(t)} - a\,\alpha_i f'(x_i^{(t)})$ at the $t$-th iteration, where $a$ is some step size (the factor $\alpha_i$ appears because the objective is separable, so its partial derivative with respect to $x_i$ is $\alpha_i f'(x_i)$). Since the constraints might be violated after such an update, how can we keep them satisfied while still moving the variables towards the optimum?
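For concreteness, the plain (unconstrained) update I have in mind can be sketched as follows; `f_prime`, `alpha`, and the step size `a` are placeholders for my particular problem data:

```python
import numpy as np

def grad_step(x, f_prime, alpha, a):
    """One gradient step on the objective sum_i alpha_i * f(x_i).

    The objective is separable, so the partial derivative with
    respect to x_i is alpha_i * f'(x_i).
    """
    return x - a * alpha * f_prime(x)

# Toy example: f(x) = x^2 (convex on [0,1]), so f'(x) = 2x.
alpha = np.array([0.5, 0.8, 0.3])
x = np.array([0.9, 0.4, 0.7])
x_next = grad_step(x, lambda z: 2 * z, alpha, a=0.1)
```

Of course, nothing here accounts for the constraints yet, which is exactly my question.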
In particular, it seems that in each iteration one has to monitor the constraint functions, e.g., by also computing their derivatives with respect to the $x_j$'s. Is there a general method for controlling this procedure?
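One candidate I have come across is projected gradient descent: since each $g_l$ is linear, say $g_l(x) = p_l x + q_l$, the $l$-th constraint is a halfspace $c_l^\top x + d_l \le 0$ with $c_l = p_l \beta$ and $d_l = q_l \sum_j \beta_j$, so the feasible set is a polytope. After each gradient step one could project back onto it. A rough sketch, assuming this reformulation (the single-halfspace projection is the exact closed form; cycling through several constraints, as below, is the POCS heuristic and only converges to *a* feasible point, not necessarily the nearest one):

```python
import numpy as np

def project_halfspace(x, c, d):
    """Project x onto {y : c @ y + d <= 0} (exact, closed form)."""
    viol = c @ x + d
    if viol <= 0:
        return x
    return x - (viol / (c @ c)) * c

def projected_step(x, f_prime, alpha, a, C, d, sweeps=50):
    """Gradient step on sum_i alpha_i f(x_i), followed by cyclic
    projections onto the box [0,1]^n and each halfspace
    C[l] @ x + d[l] <= 0 (alternating-projections heuristic)."""
    x = x - a * alpha * f_prime(x)
    for _ in range(sweeps):
        x = np.clip(x, 0.0, 1.0)
        for c_l, d_l in zip(C, d):
            x = project_halfspace(x, c_l, d_l)
    return np.clip(x, 0.0, 1.0)

# Toy run: single constraint x1 + x2 + x3 <= 1.5, zero gradient,
# so this just projects the infeasible point [0.9, 0.9, 0.9].
C = np.array([[1.0, 1.0, 1.0]])
d = np.array([-1.5])
x_feas = projected_step(np.array([0.9, 0.9, 0.9]),
                        lambda z: np.zeros_like(z),
                        alpha=np.ones(3), a=0.1, C=C, d=d)
```

But I am not sure whether this is the standard approach, or whether one should instead solve the exact projection (a small QP) in every iteration.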