Discussion:
[SciPy-Dev] SciPy-Dev Digest, Vol 161, Issue 11
Denis Akhiyarov
2017-03-11 15:08:17 UTC
Permalink
I used IPOPT's interior-point method from Python. Are you trying to add
something similar to SciPy? If so, why not just add a wrapper for IPOPT,
since the license does not look restrictive?
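For reference, IPOPT is usually driven from Python through the third-party
cyipopt wrapper; below is a minimal sketch, assuming a recent cyipopt with its
SciPy-style minimize_ipopt helper is installed (the exact import path has
varied between releases, so treat the names as approximate):

    # Minimal sketch: a toy inequality-constrained problem solved by IPOPT
    # through cyipopt's SciPy-compatible interface (installed separately).
    import numpy as np
    from cyipopt import minimize_ipopt

    def objective(x):
        return (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2

    constraints = [{'type': 'ineq',                   # interpreted as g(x) >= 0
                    'fun': lambda x: x[0] - 2.0 * x[1] + 2.0}]

    result = minimize_ipopt(objective, x0=np.array([2.0, 0.0]),
                            constraints=constraints)
    print(result.x)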
Today's Topics:
1. Re: GSoC2017: Constrained Optimisation in Scipy (Matt Haberland)
2. Re: GSoC2017: Constrained Optimisation in Scipy (Nikolay Mayorov)
----------------------------------------------------------------------
Message: 1
Date: Fri, 10 Mar 2017 10:01:32 -0800
From: Matt Haberland
Subject: Re: [SciPy-Dev] GSoC2017: Constrained Optimisation in Scipy
The choice of nonlinear optimization algorithm can have a dramatic impact
on the speed and quality of the solution, and the best choice for a
particular problem can be difficult to determine a priori, so it is
important to have multiple options available.
My work in optimal control leads to problems with (almost entirely)
nonlinear constraints, and the use of derivative information is essential
for reasonable performance, leaving SLSQP as the only option in SciPy right
now. However, the problems are also huge and very sparse with a specific
structure, so SLSQP is not very effective, and not nearly as effective as a
nonlinear optimization routine could be. So despite SciPy boasting 14
options for minimization of a nonlinear objective, it wasn't suitable for
this work (without the use of an external solver).
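To make "huge and very sparse with a specific structure" concrete, here is a
toy sketch of the kind of constraint Jacobian that shows up in direct
collocation; the chain-of-constraints form and the sizes are illustrative
assumptions only:

    # For chain-structured constraints c_i(x) = x[i+1] - x[i] - h*f_i(x[i]),
    # row i of the Jacobian touches only x[i] and x[i+1], so the matrix is
    # banded. Only the nonzero pattern is built here, with placeholder values.
    import numpy as np
    from scipy import sparse

    n = 1000                                    # number of variables
    pattern = sparse.diags([np.ones(n - 1), np.ones(n - 1)],
                           offsets=[0, 1], shape=(n - 1, n), format='csr')
    dense_bytes = (n - 1) * n * 8               # float64 dense storage
    sparse_bytes = (pattern.data.nbytes +
                    pattern.indices.nbytes + pattern.indptr.nbytes)
    print(dense_bytes // 1000, sparse_bytes // 1000)   # kB, dense vs sparse

A dense-only method like SLSQP works with the full dense matrix, which is what
makes it struggle at this scale.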
I think SciPy is in need of at least one solver designed to handle large,
fully nonlinear problems, and having two would be much better. Interior
point and SQP are good, complementary options.
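For readers less familiar with the interior-point idea, a bare-bones
log-barrier loop on a toy problem gives the flavour using only what
scipy.optimize already has; the problem, the fixed barrier schedule, and the
use of Nelder-Mead for the subproblems are illustrative choices, not a design
for the solver being discussed here:

    # Toy log-barrier method: minimize f(x) subject to g(x) > 0 by shrinking
    # the barrier weight mu and solving unconstrained subproblems.
    import numpy as np
    from scipy.optimize import minimize

    def f(x):                              # objective
        return (x[0] - 2.0) ** 2 + (x[1] + 1.0) ** 2

    def g(x):                              # feasible set: inside the unit disk
        return 1.0 - x[0] ** 2 - x[1] ** 2

    def barrier(x, mu):
        gx = g(x)
        if gx <= 0.0:                      # reject points outside the interior
            return np.inf
        return f(x) - mu * np.log(gx)

    x = np.zeros(2)                        # strictly feasible starting point
    for mu in (1.0, 1e-1, 1e-2, 1e-3):     # drive the barrier weight to zero
        x = minimize(lambda z: barrier(z, mu), x, method='Nelder-Mead').x
    print(x)                               # approaches the constrained minimizer

Production interior-point codes replace the inner Nelder-Mead solve with Newton
steps on the KKT system and exploit sparsity there, which is where the real
engineering effort (and the payoff for large problems) lies.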
Hello, my name is Antonio and I am a Brazilian electrical engineer
currently pursuing my master's degree. I have contributed to scipy.optimize
and scipy.signal, implementing the functions "iirnotch" and "iirpeak"
<https://github.com/scipy/scipy/pull/6404> and the method
"trust-region-exact" <https://github.com/scipy/scipy/pull/6919> (under
review). I am interested in applying for Google Summer of Code 2017
to work on the SciPy optimisation package.
My proposal is to improve scipy.optimize by adding optimisation methods that
are able to deal with non-linear constraints. Currently the only
implemented methods able to deal with non-linear constraints are the
Fortran wrappers SLSQP and COBYLA.
SLSQP is a sequential quadratic programming method and COBYLA is a
derivative-free optimisation method; both have their limitations. SLSQP
cannot handle sparse Hessians and Jacobians, which makes it unfit for
large-scale problems, while COBYLA, like other derivative-free methods, is a
good choice for optimising noisy objective functions but usually performs
worse than derivative-based methods when derivatives are available (or even
when they are computed by automatic differentiation or finite differences).
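For reference, this is the dict-based constraint interface those two wrappers
expose through scipy.optimize.minimize today, shown on a small made-up problem
(the problem itself is only an illustration):

    # Existing nonlinear-constraint interface, usable by SLSQP and COBYLA.
    import numpy as np
    from scipy.optimize import minimize

    fun = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2
    cons = [{'type': 'ineq',                      # interpreted as g(x) >= 0
             'fun': lambda x: 2.0 - x[0] ** 2 - x[1]}]
    # SLSQP can additionally take a 'jac' entry per constraint;
    # COBYLA is derivative-free and works from function values only.

    res_slsqp = minimize(fun, [0.0, 0.0], method='SLSQP', constraints=cons)
    res_cobyla = minimize(fun, [0.0, 0.0], method='COBYLA', constraints=cons)
    print(res_slsqp.x, res_cobyla.x)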
My proposal is to implement in SciPy one or more state-of-the-art solvers
(interior-point and SQP methods) for constrained optimisation problems. I
would like to get some feedback about this, discuss its relevance for
SciPy, and get some suggestions of possible mentors.
--
Matt Haberland
Assistant Adjunct Professor in the Program in Computing
Department of Mathematics
7620E Math Sciences Building, UCLA
------------------------------
Message: 2
Date: Sat, 11 Mar 2017 00:40:00 +0500
From: Nikolay Mayorov
Subject: Re: [SciPy-Dev] GSoC2017: Constrained Optimisation in Scipy
Hi, Antonio!
I too think that moving towards more modern algorithms and their
implementations is good for scipy. I would be happy to mentor this project,
and most likely I will be able to.
1. Have you figured out what is done in SLSQP (especially the "least squares"
part)? Do you plan to use a similar approach or a different approach to SQP?
(I gather there are several somewhat different approaches.) Settling on a
literature reference (or, most likely, several of them) is essential.
2. I think it is not wrong to focus on a single solver if you feel it will
likely take the whole time. Or maybe you can prioritize: do one thing first
for sure and then have alternatives: plan (a), switch to another solver, or
plan (b), improve or add something more minor.
3. Consider whether to fit a new solver into minimize or to make it a new,
separate function. The latter approach gives you the freedom to implement
things exactly as you want (and not depend on old suboptimal choices), but I
guess some people may consider it impractical or inconsistent. Maybe it can
be decided along the way.
4. I think it is important to start thinking about benchmark problems early,
maybe even to start with them. It's hard to develop a complicated
optimization algorithm without the ability to see right away how efficiently
it works.
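For concreteness, such a benchmark loop can start very small, for example the
sketch below, which uses only the solvers scipy.optimize already has on a toy
constrained Rosenbrock problem; the problem and the metrics printed are
placeholders, not a proposed benchmark suite.

    # Tiny benchmark loop: run the existing constrained solvers on one toy
    # problem and record objective value, evaluation count, and wall time.
    import time
    import numpy as np
    from scipy.optimize import minimize

    rosen = lambda x: 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2
    cons = [{'type': 'ineq', 'fun': lambda x: 1.0 - x[0] ** 2 - x[1] ** 2}]
    x0 = np.zeros(2)

    for method in ('SLSQP', 'COBYLA'):
        start = time.perf_counter()
        res = minimize(rosen, x0, method=method, constraints=cons)
        elapsed = time.perf_counter() - start
        print('{:7s} f={:.6f} nfev={:4d} time={:.4f}s'.format(
            method, res.fun, res.nfev, elapsed))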
------------------------------
End of SciPy-Dev Digest, Vol 161, Issue 11
******************************************
Matt Haberland
2017-03-11 17:01:40 UTC
Permalink
IPOPT is under EPL. From the FAQ:

*Does the EPL allow me to take the Source Code for a Program licensed under
it and include all or part of it in another program licensed under the GNU
General Public License (GPL), Berkeley Software Distribution (BSD) license
or other Open Source license?*
No. Only the owner of software can decide whether and how to license it to
others. Contributors to a Program licensed under the EPL understand that
source code for the Program will be made available under the terms of the
EPL. Unless you are the owner of the software or have received permission
from the owner, you are not authorized to apply the terms of another
license to the Program by including it in a program licensed under another
Open Source license.

It might be worth asking the contributors, but their permission would be
necessary.
Post by Denis Akhiyarov
I used IPOPT's interior-point method from Python. Are you trying to add
something similar to SciPy? If so, why not just add a wrapper for IPOPT,
since the license does not look restrictive?