paper/paper.md
where $f: \mathbb{R}^n \to \mathbb{R}$ is continuously differentiable on $\mathbb{R}^n$, and $h: \mathbb{R}^n \to \mathbb{R} \cup \{+\infty\}$ is lower semi-continuous.
Both $f$ and $h$ may be nonconvex.
The library provides a modular and extensible framework for experimenting with nonsmooth and nonconvex optimization algorithms, including:
- **Trust-region methods (TR, TRDH)** [@aravkin-baraldi-orban-2022; @leconte-orban-2023],
- **Quadratic regularization methods (R2, R2N)** [@diouane-habiboullah-orban-2024; @aravkin-baraldi-orban-2022],
These methods rely solely on gradient and Hessian(-vector) information of the smooth part $f$ and on the proximal mapping of the nonsmooth part $h$ to compute steps.
The objective $f + h$ is then evaluated only to accept or reject trial points.
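
To make this concrete, here is a minimal sketch of one quadratic-regularization (R2-style) iteration in plain Julia for $h = \lambda \|\cdot\|_1$: the step is computed from the gradient of $f$ and the proximal mapping of $h$ alone, and $f + h$ is evaluated only to accept or reject the trial point. The helper `soft` and the parameter choices are ours for illustration; this is not the package's implementation.

```julia
using LinearAlgebra

# Soft-thresholding: the proximal mapping of t‖·‖₁.
soft(q, t) = sign.(q) .* max.(abs.(q) .- t, 0.0)

# One R2-style iteration for min f(x) + λ‖x‖₁ with regularization parameter σ.
function r2_step(f, ∇f, λ, x, σ)
    g = ∇f(x)
    s = soft(x .- g ./ σ, λ / σ) .- x            # step from gradient + prox only
    # decrease predicted by the model f(x) + gᵀs + (σ/2)‖s‖² + λ‖x + s‖₁
    pred = -dot(g, s) - σ / 2 * dot(s, s) + λ * (norm(x, 1) - norm(x .+ s, 1))
    # actual decrease of f + h, used only to accept or reject the trial point
    ared = f(x) + λ * norm(x, 1) - f(x .+ s) - λ * norm(x .+ s, 1)
    ared / pred ≥ 1e-4 ? (x .+ s, σ / 2) : (x, 3σ)  # accept/shrink σ, or reject/grow σ
end

# Example on a nonconvex smooth part (Rosenbrock):
f(x)  = (x[1] - 1)^2 + 100 * (x[2] - x[1]^2)^2
∇f(x) = [2 * (x[1] - 1) - 400 * x[1] * (x[2] - x[1]^2), 200 * (x[2] - x[1]^2)]
x, σ = r2_step(f, ∇f, 0.1, [-1.2, 1.0], 1.0)
```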
Moreover, these methods can handle cases where Hessian approximations are unbounded [@diouane-habiboullah-orban-2024].
## Model-based framework for nonsmooth methods
One way to solve \eqref{eq:nlp} in Julia is [ProximalAlgorithms.jl](https://github.com/JuliaFirstOrder/ProximalAlgorithms.jl), which implements in-place, first-order, line-search-based methods for \eqref{eq:nlp}.
Most of these methods are splitting schemes that alternate between gradient (or quasi-Newton) steps on the smooth part $f$ and proximal steps on the nonsmooth part $h$.
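
As a point of comparison with the model-based iteration above, a basic forward-backward (proximal-gradient) step of this kind is sketched below in plain Julia for $h = \lambda \|\cdot\|_1$ and a step size $\alpha$; it illustrates the splitting idea only and is not ProximalAlgorithms.jl's API.

```julia
# One forward-backward (proximal-gradient) step for min f(x) + λ‖x‖₁:
# a gradient step on the smooth part, then a proximal step on the nonsmooth part.
soft(q, t) = sign.(q) .* max.(abs.(q) .- t, 0.0)   # prox of t‖·‖₁
pg_step(∇f, λ, x, α) = soft(x .- α .* ∇f(x), α * λ)
```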
Currently, **ProximalAlgorithms.jl** provides only L-BFGS as a quasi-Newton option.
By contrast, **RegularizedOptimization.jl** focuses on model-based approaches such as trust-region and regularization algorithms.
Finally, nonsmooth terms $h$ can be modeled using [ProximalOperators.jl](https://github.com/JuliaFirstOrder/ProximalOperators.jl), which provides a broad collection of nonsmooth functions, together with [ShiftedProximalOperators.jl](https://github.com/JuliaSmoothOptimizers/ShiftedProximalOperators.jl), which provides shifted proximal mappings for nonsmooth functions.
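
As an illustration of this modeling layer, the snippet below assembles the two ingredients of \eqref{eq:nlp} from these packages; the objective and regularization weight are arbitrary choices for the example, and we omit the solver call since its exact signature belongs to the package documentation.

```julia
using ADNLPModels, ProximalOperators

# Smooth part f: a nonconvex model providing gradients and
# Hessian-vector products through the NLPModels API.
f = ADNLPModel(x -> (x[1] - 1)^2 + 100 * (x[2] - x[1]^2)^2, [-1.2, 1.0])

# Nonsmooth part h: an ℓ₁ regularizer with weight λ = 1 and its proximal mapping.
h = NormL1(1.0)
```

Solvers in **RegularizedOptimization.jl** such as TR and R2 then operate on such a pair.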
This modularity makes it easy to benchmark the existing solvers available in the repository [@diouane-habiboullah-orban-2024; @aravkin-baraldi-orban-2022; @aravkin-baraldi-orban-2024; @leconte-orban-2023-2].
## Support for Hessians
A way to use Hessians is via automatic differentiation tools such as [ADNLPModels.jl](https://github.com/JuliaSmoothOptimizers/ADNLPModels.jl).
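
For example, with an `ADNLPModel`, gradients and Hessian-vector products of $f$ are available through the NLPModels.jl API (the model below is a made-up example):

```julia
using ADNLPModels, NLPModels

nlp = ADNLPModel(x -> x[1]^4 + (x[2] - 1)^2, [0.5, 0.5])
x  = nlp.meta.x0
g  = grad(nlp, x)                # ∇f(x) by automatic differentiation
Hv = hprod(nlp, x, [1.0, 0.0])   # Hessian-vector product ∇²f(x)v
```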
The nonsmooth part $h$ must have a computable proximal mapping, defined as
$$\operatorname{prox}_{\nu h}(q) = \operatorname*{arg\,min}_{w \in \mathbb{R}^n} \ \tfrac{1}{2\nu} \|w - q\|_2^2 + h(w),$$
where $\nu > 0$ is a step length.
This requirement is satisfied by a wide range of nonsmooth functions commonly used in practice, such as the $\ell_1$ norm, the $\ell_0$ "norm", indicator functions of convex sets, and group sparsity-inducing norms.
The package [ProximalOperators.jl](https://github.com/JuliaFirstOrder/ProximalOperators.jl) provides a comprehensive collection of such functions, along with their proximal mappings.
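
For instance, the $\ell_1$ norm's proximal mapping is soft-thresholding, which can be evaluated directly (the weight and point below are arbitrary):

```julia
using ProximalOperators

h = NormL1(2.0)                          # h(x) = 2‖x‖₁
y, hy = prox(h, [3.0, -0.5, 1.0], 0.5)   # prox_{γh} with γ = 0.5: threshold λγ = 1
# y == [2.0, 0.0, 0.0] and hy == h(y) == 4.0
```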
The main difference between the proximal operators implemented in [ProximalOperators.jl](https://github.com/JuliaFirstOrder/ProximalOperators.jl) and those in [ShiftedProximalOperators.jl](https://github.com/JuliaSmoothOptimizers/ShiftedProximalOperators.jl) is that the latter compute proximal mappings of shifted nonsmooth functions, i.e., solutions of
$$\operatorname*{arg\,min}_{t} \ \tfrac{1}{2} \|t - q\|_2^2 + \nu h(x + s + t) + \chi(t; \Delta \mathbb{B}),$$
where $q$ is given, $x$ and $s$ are fixed shifts, $h$ is the nonsmooth term with respect to which we are computing the proximal operator, and $\chi(\cdot; \Delta \mathbb{B})$ is the indicator of a ball of radius $\Delta$ defined by a certain norm.
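
To see what the shift amounts to in a simple case, take $h = \lambda \|\cdot\|_1$ and an infinite radius $\Delta$: substituting $u = x + s + t$ reduces the shifted problem to ordinary soft-thresholding. Below is a minimal hand-rolled sketch of that special case, not the ShiftedProximalOperators.jl API.

```julia
# Shifted prox for h = λ‖·‖₁ with Δ = ∞:
#   argmin_t  ½‖t − q‖² + νλ‖x + s + t‖₁.
soft(q, t) = sign.(q) .* max.(abs.(q) .- t, 0.0)

function shifted_prox_l1(q, x, s, λ, ν)
    u = soft(x .+ s .+ q, ν * λ)   # solve in the shifted variable u = x + s + t
    return u .- (x .+ s)           # map back to the step t
end
```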
## Testing and documentation
The package includes a comprehensive suite of unit tests that cover all functionalities, ensuring reliability and correctness.
All solvers in **RegularizedOptimization.jl** are implemented in an in-place fashion.
# Examples
We consider two examples where the smooth part $f$ is nonconvex and the nonsmooth part $h$ is either the $\ell_0$ or the $\ell_1$ norm.
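
Both regularizers, and their proximal mappings (hard and soft thresholding, respectively), are available in ProximalOperators.jl; the point and weights below are arbitrary:

```julia
using ProximalOperators

x = [1.5, -0.3, 0.8]
y1, _ = prox(NormL1(1.0), x, 1.0)   # soft-thresholding at 1: [0.5, 0.0, 0.0]
y0, _ = prox(NormL0(1.0), x, 1.0)   # hard-thresholding: keeps |xᵢ| > √2, i.e. [1.5, 0.0, 0.0]
```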
The first example is the FitzHugh-Nagumo inverse problem with an $\ell_1$ penalty, as described in [@aravkin-baraldi-orban-2022; @aravkin-baraldi-orban-2024].