Discussion:
[SciPy-Dev] GSoC2017: Constrained Optimisation in Scipy
Antonio Ribeiro
2017-03-09 22:38:49 UTC
Hello, my name is Antonio and I am a Brazilian electrical engineer currently pursuing my master's degree. I have contributed to scipy.optimize and scipy.signal, implementing the functions "iirnotch" and "iirpeak" <https://github.com/scipy/scipy/pull/6404> and the method "trust-region-exact" <https://github.com/scipy/scipy/pull/6919> (under review). I am interested in applying for Google Summer of Code 2017 to work on the SciPy optimisation package.

My proposal is to improve scipy.optimize by adding optimisation methods able to deal with non-linear constraints. Currently the only implemented methods able to deal with non-linear constraints are the FORTRAN wrappers SLSQP and COBYLA.

SLSQP is a sequential quadratic programming method and COBYLA is a derivative-free optimisation method, and both have their limitations: SLSQP cannot deal with sparse Hessians and Jacobians and is unfit for large-scale problems, while COBYLA, like other derivative-free methods, is a good choice for optimising noisy objective functions but usually performs worse than derivative-based methods when derivatives are available (or even when they are computed by automatic differentiation or finite differences).
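For concreteness, here is how both existing solvers are invoked today through scipy.optimize.minimize; the small problem below is made up purely for illustration.

from scipy.optimize import minimize

# Made-up problem: minimize x^2 + y^2 subject to the nonlinear
# inequality x*y >= 1; both current constrained solvers handle it.
objective = lambda z: z[0]**2 + z[1]**2
constraint = {'type': 'ineq', 'fun': lambda z: z[0] * z[1] - 1.0}

for method in ('SLSQP', 'COBYLA'):
    res = minimize(objective, x0=[2.0, 2.0], method=method,
                   constraints=[constraint])
    print(method, res.x)  # both converge to roughly [1, 1]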

My proposal is to implement in SciPy one or more state-of-the-art solvers (interior point and SQP methods) for constrained optimisation problems. I would like to get some feedback about this, discuss its relevance for SciPy and get some suggestions for possible mentors.
Matt Haberland
2017-03-10 18:01:32 UTC
Nikolay Mayorov
2017-03-10 19:40:00 UTC
Benny Malengier
2017-03-11 18:55:33 UTC
Just some thoughts,

I have seen quite a few methods leveraging CVODES from the SUNDIALS suite to obtain the derivatives, e.g. CasADi and DOTcvpSB.

The odes scikit I maintain has a CVODE interface, but not yet CVODES, though that should not be too hard to add. Other Python interfaces to SUNDIALS might have it.

Independently of that, this gives you AD, but the work is in generating the correct equations so as to leverage CVODES.
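As a point of reference, a minimal sketch of the existing CVODE interface in the odes scikit, assuming the new-style API of recent scikits.odes versions (plain integration only; the CVODES sensitivity features discussed above are not wrapped yet):

import numpy as np
from scikits.odes import ode

# Integrate y' = -0.5 * y from y(0) = 1 with CVODE.
def rhs(t, y, ydot):
    ydot[0] = -0.5 * y[0]

solver = ode('cvode', rhs, old_api=False)
result = solver.solve(np.linspace(0.0, 4.0, 9), np.array([1.0]))
print(result.values.y[-1])  # about exp(-2) = 0.1353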

Benny
Post by Nikolay Mayorov
Hi, Antonio!
I too think that moving towards more modern algorithms and their
implementations is good for scipy. I would be happy to mentor this project,
and most likely I will be able to.
1. Have you figured out what is done in SLSQP (especially the "least squares" part)? Do you plan to use a similar approach or a different approach to SQP? (I gather there are several somewhat different approaches.) Settling on a literature reference (or, most likely, several of them) is essential.
2. I think it is not wrong to focus on a single solver if you feel that it will likely take the whole time. Or maybe you can prioritize: do this first for sure, and then have alternatives: plan a) switch to another solver, or plan b) improve/add something more minor.
3. Consider whether to fit a new solver into minimize or to make it a separate solver. The latter approach gives you the freedom to implement things exactly as you want (and not to depend on old suboptimal choices), but I guess it can be considered impractical/inconsistent by some people. Maybe it can be decided along the way.
4. I think it is important to start thinking about benchmark problems early, maybe even start with them. It is hard to develop a complicated optimization algorithm without the ability to see right away how efficiently it works; a rough harness along the lines of the sketch below would already help.
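A hedged sketch of such a harness, using a stand-in problem rather than a proposed benchmark set:

import time
import numpy as np
from scipy.optimize import minimize, rosen

# Time each existing constrained solver on the same problem and record
# function-evaluation counts, so algorithmic changes are visible early.
constraint = {'type': 'ineq', 'fun': lambda x: 1.8 - np.sum(x**2)}
x0 = np.full(5, 0.5)

for method in ('SLSQP', 'COBYLA'):
    start = time.perf_counter()
    res = minimize(rosen, x0, method=method, constraints=[constraint])
    elapsed = time.perf_counter() - start
    print('%s: f=%.4g nfev=%d %.3fs' % (method, res.fun, res.nfev, elapsed))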
---- On Fri, 10 Mar 2017 23:01:32 +0500, Matt Haberland wrote ----
The choice of nonlinear optimization algorithm can have a dramatic impact
on the speed and quality of the solution, and the best choice for a
particular problem can be difficult to determine a priori, so it is
important to have multiple options available.
My work in optimal control leads to problems with (almost entirely)
nonlinear constraints, and the use of derivative information is essential
for reasonable performance, leaving SLSQP as the only option in SciPy
right now. However, the problems are also huge and very sparse with a
specific structure, so SLSQP is not very effective, and not nearly as
effective as a nonlinear optimization routine could be. So despite SciPy
boasting 14 options for minimization of a nonlinear objective, it wasn't
suitable for this work (without the use of an external solver).
I think SciPy is in need of at least one solver designed to handle large,
fully nonlinear problems, and having two would be much better. Interior
point and SQP are good, complementary options.
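To make the scale argument concrete, some back-of-envelope numbers for a hypothetical problem with 100,000 variables and constraints and about 10 nonzeros per Jacobian row:

# Dense versus sparse storage of the constraint Jacobian (float64).
n = m = 100000
nnz = m * 10
dense_gb = n * m * 8 / 1e9                        # fully dense array
sparse_gb = (nnz * (8 + 4) + (m + 1) * 4) / 1e9   # CSR: values, column indices, row pointers
print(dense_gb, sparse_gb)                        # 80.0 GB versus about 0.012 GB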
--
Matt Haberland
Assistant Adjunct Professor in the Program in Computing
Department of Mathematics
7620E Math Sciences Building, UCLA
Suzen, Mehmet
2017-03-11 19:16:52 UTC
I agree with Matt Haberland. A couple of different options would be great. Some of the R code could be leveraged for a barrier method [1]. Embedding Steven Johnson's NLopt into SciPy would be nice for nonlinear constraints [2], and a derivative-free method like COBYLA would be nice to have [3].
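For the barrier idea, a minimal log-barrier sketch in the spirit of R's constrOptim [1]; the helper name and the small problem are illustrative only, not an existing API:

import numpy as np
from scipy.optimize import minimize

# Solve a sequence of unconstrained problems f(x) - mu * sum(log g_i(x))
# with a shrinking barrier weight mu, for constraints g_i(x) >= 0.
def barrier_minimize(f, constraints, x0, mu=1.0, shrink=0.1, iters=6):
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        def augmented(z):
            g = np.array([c(z) for c in constraints])
            if np.any(g <= 0):          # left the feasible interior
                return np.inf
            return f(z) - mu * np.sum(np.log(g))
        x = minimize(augmented, x, method='Nelder-Mead').x
        mu *= shrink                    # tighten the barrier
    return x

# Example: minimize (x-2)^2 + (y-1)^2 subject to x + y <= 2, x >= 0, y >= 0.
f = lambda z: (z[0] - 2)**2 + (z[1] - 1)**2
cons = [lambda z: 2 - z[0] - z[1], lambda z: z[0], lambda z: z[1]]
print(barrier_minimize(f, cons, [0.5, 0.5]))  # about [1.5, 0.5]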

-m

[1] https://stat.ethz.ch/R-manual/R-devel/library/stats/html/constrOptim.html
    https://svn.r-project.org/R/trunk/src/library/stats/R/constrOptim.R

[2] http://ab-initio.mit.edu/wiki/index.php/NLopt
    https://mail.scipy.org/pipermail/numpy-discussion/2010-June/051220.html

[3] http://ab-initio.mit.edu/wiki/index.php/NLopt_Algorithms#COBYLA_.28Constrained_Optimization_BY_Linear_Approximations.29