The sparsity of solutions is a recurring requirement in many applications of operations research. Many variables can be forced to zero by introducing $\ell_0$-norm or $\ell_1$-norm terms into optimization models. In this thesis we would like to
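As a point of reference (a generic formulation, not one of the specific models studied here), an $\ell_0$-penalized problem and its usual convex $\ell_1$ surrogate can be written as
\[
\min_{x \in \mathbb{R}^n} \; f(x) + \lambda \,\|x\|_0
\qquad\text{and}\qquad
\min_{x \in \mathbb{R}^n} \; f(x) + \lambda \,\|x\|_1 ,
\]
where $\|x\|_0 = |\{\, i : x_i \neq 0 \,\}|$ counts the nonzero entries, $\|x\|_1 = \sum_i |x_i|$, and $\lambda > 0$ trades off the original objective $f$ against the sparsity of the solution.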
Differentiable Architecture Search (DARTS) is a recent approach to discovering state-of-the-art neural network architectures. Designing such architectures by hand usually requires an unacceptable amount of human effort, hence the interest in developing algorithmic solutions that automate the manual process of architecture design is
Standard Quadratic Optimization problems are an important family of problems with a large number of applications in finance, decision science, graph algorithms, and other fields. For these nonconvex problems we are interested in finding global optima; however, exact global optimization algorithms hardly scale
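For reference, the canonical form of a Standard Quadratic Optimization problem is the minimization of a (generally indefinite) quadratic form over the standard simplex:
\[
\min \; x^{\top} Q x
\quad \text{s.t.} \quad e^{\top} x = 1, \; x \ge 0 ,
\]
where $Q$ is a symmetric $n \times n$ matrix and $e$ is the all-ones vector; the problem is nonconvex whenever $Q$ is not positive semidefinite.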
Artificial Neural Networks are powerful learning models that achieve extraordinary performance in many complex tasks. However, state-of-the-art architectures often require a huge number of parameters, making them hard to deploy on hardware with limited resources.