# Optim.jl

[![](https://img.shields.io/badge/docs-stable-blue.svg)](https://julianlsolvers.github.io/Optim.jl/stable)
[![](https://img.shields.io/badge/docs-latest-blue.svg)](https://julianlsolvers.github.io/Optim.jl/dev)
[![Build Status](https://github.com/JuliaNLSolvers/Optim.jl/actions/workflows/windows.yml/badge.svg)](https://github.com/JuliaNLSolvers/Optim.jl/actions/workflows/windows.yml)
[![Build Status](https://github.com/JuliaNLSolvers/Optim.jl/actions/workflows/linux.yml/badge.svg)](https://github.com/JuliaNLSolvers/Optim.jl/actions/workflows/linux.yml)
[![Build Status](https://github.com/JuliaNLSolvers/Optim.jl/actions/workflows/mac.yml/badge.svg)](https://github.com/JuliaNLSolvers/Optim.jl/actions/workflows/mac.yml)
[![Codecov branch](https://img.shields.io/codecov/c/github/JuliaNLSolvers/Optim.jl/master.svg)](https://codecov.io/gh/JuliaNLSolvers/Optim.jl)
[![JOSS](http://joss.theoj.org/papers/10.21105/joss.00615/status.svg)](https://doi.org/10.21105/joss.00615)

Univariate and multivariate optimization in Julia.

Optim.jl is part of the [JuliaNLSolvers](https://github.com/JuliaNLSolvers)
family.

## Help and support

For help and support, please post on the
[Optimization (Mathematical)](https://discourse.julialang.org/c/domain/opt/13)
section of the Julia discourse or the `#math-optimization` channel of the
Julia [slack](https://julialang.org/slack/).

## Installation

Install `Optim.jl` using the Julia package manager:

```julia
import Pkg
Pkg.add("Optim")
```

## Documentation

The online documentation is available at
[https://julianlsolvers.github.io/Optim.jl/stable](https://julianlsolvers.github.io/Optim.jl/stable).

## Example

To minimize the [Rosenbrock function](https://en.wikipedia.org/wiki/Rosenbrock_function),
do:

```julia
julia> using Optim

julia> rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
rosenbrock (generic function with 1 method)

julia> result = optimize(rosenbrock, zeros(2), BFGS())
 * Status: success

 * Candidate solution
    Final objective value:     5.471433e-17

 * Found with
    Algorithm:     BFGS

 * Convergence measures
    |x - x'|               = 3.47e-07 ≰ 0.0e+00
    |x - x'|/|x'|          = 3.47e-07 ≰ 0.0e+00
    |f(x) - f(x')|         = 6.59e-14 ≰ 0.0e+00
    |f(x) - f(x')|/|f(x')| = 1.20e+03 ≰ 0.0e+00
    |g(x)|                 = 2.33e-09 ≤ 1.0e-08

 * Work counters
    Seconds run:   0  (vs limit Inf)
    Iterations:    16
    f(x) calls:    53
    ∇f(x) calls:   53

julia> Optim.minimizer(result)
2-element Vector{Float64}:
 0.9999999926033423
 0.9999999852005355

julia> Optim.minimum(result)
5.471432670590216e-17
```
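
If an analytic gradient is available, Optim can use it instead of finite
differencing via the `optimize(f, g!, x0, method)` form. The sketch below
derives the Rosenbrock gradient by hand; the helper name `rosenbrock_grad!`
is ours, not part of Optim's API:

```julia
using Optim

rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

# In-place gradient of the Rosenbrock function:
#   ∂f/∂x₁ = -2(1 - x₁) - 400 x₁ (x₂ - x₁²)
#   ∂f/∂x₂ = 200 (x₂ - x₁²)
function rosenbrock_grad!(g, x)
    g[1] = -2.0 * (1.0 - x[1]) - 400.0 * x[1] * (x[2] - x[1]^2)
    g[2] = 200.0 * (x[2] - x[1]^2)
    return g
end

# Passing g! avoids the finite-difference gradient approximation.
result = optimize(rosenbrock, rosenbrock_grad!, zeros(2), BFGS())
```

Supplying the gradient typically reduces the `f(x) calls` count reported in
the work counters, since no extra evaluations are spent on differencing.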

To get information on the keywords used to construct method instances, use the
Julia REPL help prompt (`?`)

```julia
help?> LBFGS
search: LBFGS

  LBFGS
  ≡≡≡≡≡

  Constructor
  ===========

  LBFGS(; m::Integer = 10,
          alphaguess = LineSearches.InitialStatic(),
          ...
          manifold = Flat(),
          scaleinvH0::Bool = true && (typeof(P) <: Nothing))

  LBFGS has two special keywords; the memory length m, and the scaleinvH0
  flag. The memory length determines how many previous Hessian approximations
  to store. When scaleinvH0 == true, then the initial guess in the two-loop
  recursion to approximate the inverse Hessian is the scaled identity, as can
  be found in Nocedal and Wright (2nd edition) (sec. 7.2).

  In addition, LBFGS supports preconditioning via the P and precondprep
  keywords.

  Description
  ===========

  The LBFGS method implements the limited-memory BFGS algorithm as described
  in Nocedal and Wright (sec. 7.2, 2006) and the original paper by Liu &
  Nocedal (1989). It is a quasi-Newton method that updates an approximation
  to the Hessian using past approximations as well as the gradient.

  References
  ==========

    • Wright, S. J. and J. Nocedal (2006), Numerical optimization, 2nd
      edition. Springer

    • Liu, D. C. and Nocedal, J. (1989). "On the Limited Memory Method for
      Large Scale Optimization". Mathematical Programming B. 45 (3): 503–528
```
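
As a sketch of the keywords described in the help entry above, the memory
length `m` is set when constructing the method instance; the objective and
starting point mirror the earlier example:

```julia
using Optim

rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

# Store 20 previous curvature pairs instead of the default m = 10.
result = optimize(rosenbrock, zeros(2), LBFGS(m = 20))
```

A larger memory length gives a better inverse-Hessian approximation at the
cost of extra storage per iteration.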

## Use with JuMP

You can use Optim.jl with [JuMP.jl](https://github.com/jump-dev/JuMP.jl) as
follows:

```julia
julia> using JuMP, Optim

julia> model = Model(Optim.Optimizer);

julia> set_optimizer_attribute(model, "method", BFGS())

julia> @variable(model, x[1:2]);

julia> @objective(model, Min, (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2)
(x[1]² - 2 x[1] + 1) + (100.0 * ((-x[1]² + x[2]) ^ 2.0))

julia> optimize!(model)

julia> objective_value(model)
3.7218241804173566e-21

julia> value.(x)
2-element Vector{Float64}:
 0.9999999999373603
 0.99999999986862
```
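
Optim itself also supports simple box constraints through the `Fminbox`
wrapper, which runs a first-order inner optimizer inside a barrier method.
The bounds and starting point below are illustrative values chosen so the
box excludes the unconstrained minimizer at (1, 1):

```julia
using Optim

rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

lower = [0.0, 0.0]
upper = [0.5, 0.5]   # box excludes the unconstrained minimum
x0    = [0.2, 0.2]   # starting point must lie inside the box

# Fminbox wraps an inner optimizer (here BFGS) with a box barrier.
result = optimize(rosenbrock, lower, upper, x0, Fminbox(BFGS()))
```

The returned minimizer then lies on or inside the bounds rather than at the
unconstrained optimum.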

## Citation

If you use `Optim.jl` in your work, please cite the following:

```tex
@article{mogensen2018optim,
  ...
  doi = {10.21105/joss.00615}
}
```