Add SBPLX optimizer #924

Merged · 6 commits · May 16, 2025
6 changes: 5 additions & 1 deletion .pylintdict
@@ -253,6 +253,7 @@
izaac
izz
jac
jacobian
johnson
jm
jonathan
jones
@@ -487,6 +488,7 @@
sanjiv
sashank
satisfiability
satyen
sbplx
scalability
schroediger
schroedinger
@@ -529,13 +531,15 @@
stdout
stefano
steppable
stepsize
steven
str
stratifications
stratification
subcircuits
subclassed
subclasses
subcomponents
subplex
submodules
subobjects
subseteq
@@ -627,4 +631,4 @@
zz
ω
φ_i
φ_ij
Δ
Δ
98 changes: 56 additions & 42 deletions qiskit_machine_learning/optimizers/__init__.py
@@ -1,6 +1,6 @@
# This code is part of a Qiskit project.
#
# (C) Copyright IBM 2018, 2024.
# (C) Copyright IBM 2018, 2025.
#
# This code is licensed under the Apache License, Version 2.0. You may
# obtain a copy of this license in the LICENSE.txt file in the root directory
@@ -32,56 +32,57 @@
----------------------

.. autosummary::
:toctree: ../stubs/
:nosignatures:
:toctree: ../stubs/
:nosignatures:

OptimizerResult
Optimizer
Minimizer
OptimizerResult
Optimizer
Minimizer

Steppable optimization
----------------------

.. autosummary::
:toctree: ../stubs/
:toctree: ../stubs/

optimizer_utils
optimizer_utils

.. autosummary::
:toctree: ../stubs/
:nosignatures:
:toctree: ../stubs/
:nosignatures:

SteppableOptimizer
AskData
TellData
OptimizerState
SteppableOptimizer
AskData
TellData
OptimizerState


Local optimizers
----------------

.. autosummary::
:toctree: ../stubs/
:nosignatures:

ADAM
AQGD
CG
COBYLA
L_BFGS_B
GSLS
GradientDescent
GradientDescentState
NELDER_MEAD
NFT
P_BFGS
POWELL
SLSQP
SPSA
QNSPSA
TNC
SciPyOptimizer
UMDA
:toctree: ../stubs/
:nosignatures:

ADAM
AQGD
CG
COBYLA
L_BFGS_B
GSLS
GradientDescent
GradientDescentState
NELDER_MEAD
NFT
P_BFGS
POWELL
SLSQP
SPSA
QNSPSA
TNC
SciPyOptimizer
UMDA


The optimizers from
`scikit-quant <https://scikit-quant.readthedocs.io/en/latest/>`_ are not included in the
@@ -91,21 +92,32 @@
https://github.com/qiskit-community/qiskit-algorithms/issues/84.


Qiskit also provides local optimizers based on
`NLOpt <https://nlopt.readthedocs.io/en/latest/>`_.
See the Global optimizers section below for the optional NLopt installation instructions.

.. autosummary::
:toctree: ../stubs/
:nosignatures:

SBPLX


Global optimizers
-----------------
The global optimizers here all use `NLOpt <https://nlopt.readthedocs.io/en/latest/>`_ for their
core function and can only be used if the optional ``nlopt`` package is installed.
To install it, run ``pip install nlopt``.

.. autosummary::
:toctree: ../stubs/
:nosignatures:
:toctree: ../stubs/
:nosignatures:

CRS
DIRECT_L
DIRECT_L_RAND
ESCH
ISRES
CRS
DIRECT_L
DIRECT_L_RAND
ESCH
ISRES

"""

@@ -123,6 +135,7 @@
from .nlopts.direct_l_rand import DIRECT_L_RAND
from .nlopts.esch import ESCH
from .nlopts.isres import ISRES
from .nlopts.sbplx import SBPLX
from .steppable_optimizer import SteppableOptimizer, AskData, TellData, OptimizerState
from .optimizer import Minimizer, Optimizer, OptimizerResult, OptimizerSupportLevel
from .p_bfgs import P_BFGS
@@ -165,5 +178,6 @@
"DIRECT_L_RAND",
"ESCH",
"ISRES",
"SBPLX",
"UMDA",
]
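
With these export changes, ``SBPLX`` becomes importable directly from ``qiskit_machine_learning.optimizers``. A minimal usage sketch (not part of this PR), assuming the optional ``nlopt`` package is installed and using the standard ``Optimizer.minimize`` API:

```python
# Minimal usage sketch (not part of this PR); assumes ``pip install nlopt``.
import numpy as np

from qiskit_machine_learning.optimizers import SBPLX


def objective(x):
    # Quadratic bowl with its minimum at (1, -2).
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2


optimizer = SBPLX(max_evals=500)  # caps the number of NLopt function evaluations
result = optimizer.minimize(fun=objective, x0=np.array([0.0, 0.0]))
print(result.x, result.fun, result.nfev)
```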
5 changes: 3 additions & 2 deletions qiskit_machine_learning/optimizers/nlopts/__init__.py
@@ -1,6 +1,6 @@
# This code is part of a Qiskit project.
#
# (C) Copyright IBM 2018, 2024.
# (C) Copyright IBM 2018, 2025.
#
# This code is licensed under the Apache License, Version 2.0. You may
# obtain a copy of this license in the LICENSE.txt file in the root directory
@@ -10,11 +10,12 @@
# copyright notice, and modified files need to carry a notice indicating
# that they have been altered from the originals.

"""NLopt based global optimizers"""
"""NLopt-based global and local optimizers"""

from .crs import CRS
from .direct_l import DIRECT_L
from .direct_l_rand import DIRECT_L_RAND
from .esch import ESCH
from .isres import ISRES
from .sbplx import SBPLX
from .nloptimizer import NLoptOptimizer
6 changes: 4 additions & 2 deletions qiskit_machine_learning/optimizers/nlopts/nloptimizer.py
@@ -1,6 +1,6 @@
# This code is part of a Qiskit project.
#
# (C) Copyright IBM 2018, 2024.
# (C) Copyright IBM 2018, 2025.
#
# This code is licensed under the Apache License, Version 2.0. You may
# obtain a copy of this license in the LICENSE.txt file in the root directory
@@ -33,12 +33,13 @@ class NLoptOptimizerType(Enum):
GN_DIRECT_L = 3
GN_ESCH = 4
GN_ISRES = 5
LN_SBPLX = 6


@_optionals.HAS_NLOPT.require_in_instance
class NLoptOptimizer(Optimizer):
"""
NLopt global optimizer base class
NLopt local and global optimizer base class
"""

_OPTIONS = ["max_evals"]
@@ -64,6 +65,7 @@ def __init__(self, max_evals: int = 1000) -> None: # pylint: disable=unused-arg
NLoptOptimizerType.GN_DIRECT_L: nlopt.GN_DIRECT_L,
NLoptOptimizerType.GN_ESCH: nlopt.GN_ESCH,
NLoptOptimizerType.GN_ISRES: nlopt.GN_ISRES,
NLoptOptimizerType.LN_SBPLX: nlopt.LN_SBPLX,
}

@abstractmethod
36 changes: 36 additions & 0 deletions qiskit_machine_learning/optimizers/nlopts/sbplx.py
@@ -0,0 +1,36 @@
# This code is part of a Qiskit project.
#
# (C) Copyright IBM 2025.
#
# This code is licensed under the Apache License, Version 2.0. You may
# obtain a copy of this license in the LICENSE.txt file in the root directory
# of this source tree or at http://www.apache.org/licenses/LICENSE-2.0.
#
# Any modifications or derivative works of this code must retain this
# copyright notice, and modified files need to carry a notice indicating
# that they have been altered from the originals.

"""Sbplx (Subplex) optimizer."""

from .nloptimizer import NLoptOptimizer, NLoptOptimizerType


class SBPLX(NLoptOptimizer):
"""
Subplex optimizer.

'Subplex (a variant of Nelder-Mead that uses Nelder-Mead on a sequence of subspaces)
is claimed to be much more efficient and robust than the original Nelder-Mead,
while retaining the latter's facility with discontinuous objectives.
While these claims seem to be true in many cases, we could not find any proof that
Subplex is globally convergent, and perhaps it may fail for some objective functions,
like Nelder-Mead; YMMV.', by Steven G. Johnson, author of the NLopt library.

NLopt local optimizer, derivative-free.
For further detail, please refer to
https://nlopt.readthedocs.io/en/latest/NLopt_Algorithms/#sbplx-based-on-subplex
"""

def get_nlopt_optimizer(self) -> NLoptOptimizerType:
"""Return NLopt optimizer type."""
return NLoptOptimizerType.LN_SBPLX
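
The new file shows the full pattern for exposing an NLopt algorithm: subclass ``NLoptOptimizer`` and return the matching ``NLoptOptimizerType`` member. As a purely hypothetical illustration (not in this PR, and ``LN_BOBYQA`` is not a member of the enum as it stands), wrapping another NLopt local algorithm would look the same:

```python
# Hypothetical sketch only: BOBYQA is NOT added by this PR, and
# NLoptOptimizerType has no LN_BOBYQA member; this illustrates the pattern.
from .nloptimizer import NLoptOptimizer, NLoptOptimizerType


class BOBYQA(NLoptOptimizer):
    """Hypothetical wrapper for NLopt's LN_BOBYQA local optimizer."""

    def get_nlopt_optimizer(self) -> NLoptOptimizerType:
        # Would require adding LN_BOBYQA to NLoptOptimizerType and to the
        # enum -> nlopt mapping in NLoptOptimizer.__init__.
        return NLoptOptimizerType.LN_BOBYQA
```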
8 changes: 8 additions & 0 deletions (release note)
@@ -0,0 +1,8 @@
---
features:
- |
Support for the :class:`.SBPLX` optimizer from the NLopt library has been added.
SBPLX is a local, gradient-free optimizer based on Nelder-Mead and
is expected to show better convergence behavior than plain Nelder-Mead.
Further information about this optimizer and the others can be found in
the API reference for :mod:`~qiskit_machine_learning.optimizers`.
4 changes: 3 additions & 1 deletion test/optimizers/test_optimizers.py
@@ -1,6 +1,6 @@
# This code is part of a Qiskit project.
#
# (C) Copyright IBM 2018, 2024.
# (C) Copyright IBM 2018, 2025.
#
# This code is licensed under the Apache License, Version 2.0. You may
# obtain a copy of this license in the LICENSE.txt file in the root directory
@@ -39,6 +39,7 @@
Optimizer,
P_BFGS,
POWELL,
SBPLX,
SLSQP,
SPSA,
QNSPSA,
@@ -221,6 +222,7 @@ def test_scipy_optimizer_parse_bounds(self):
(CRS, False),
(DIRECT_L, False),
(DIRECT_L_RAND, False),
(SBPLX, True),
)
@unpack
def test_nlopt(self, optimizer_cls, use_bound):
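
The ``(SBPLX, True)`` case runs the shared ``test_nlopt`` path with bounds enabled, since Subplex supports box constraints through NLopt. A rough sketch of the kind of bounded minimization that path exercises (assumed objective and bounds, not the actual test body):

```python
# Rough sketch of the bounded case the parameterized test exercises
# (assumed objective and bounds; not the actual test body).
import numpy as np

from qiskit_machine_learning.optimizers import SBPLX


def rosenbrock(x):
    return float(np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2))


optimizer = SBPLX(max_evals=2000)
result = optimizer.minimize(
    fun=rosenbrock,
    x0=np.array([1.3, 0.7]),
    bounds=[(-6.0, 6.0), (-6.0, 6.0)],  # one (low, high) pair per variable
)
print(result.x)  # should approach the optimum at (1, 1)
```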