
Commit 3af51c8

odow and pkofod authored
Update and reorganize README.md (#1076)
* Update and reorganize README.md

* Update README.md

Co-authored-by: Patrick Kofod Mogensen <patrick.mogensen@gmail.com>
1 parent ca3513f commit 3af51c8

1 file changed (+104, -130 lines)

README.md

Lines changed: 104 additions & 130 deletions
@@ -1,73 +1,87 @@
-Optim.jl
-========
+# Optim.jl
+
+[![](https://img.shields.io/badge/docs-stable-blue.svg)](https://julianlsolvers.github.io/Optim.jl/stable)
+[![](https://img.shields.io/badge/docs-latest-blue.svg)](https://julianlsolvers.github.io/Optim.jl/dev)
+[![Build Status](https://github.com/JuliaNLSolvers/Optim.jl/actions/workflows/windows.yml/badge.svg)](https://github.com/JuliaNLSolvers/Optim.jl/actions/workflows/windows.yml)
+[![Build Status](https://github.com/JuliaNLSolvers/Optim.jl/actions/workflows/linux.yml/badge.svg)](https://github.com/JuliaNLSolvers/Optim.jl/actions/workflows/linux.yml)
+[![Build Status](https://github.com/JuliaNLSolvers/Optim.jl/actions/workflows/mac.yml/badge.svg)](https://github.com/JuliaNLSolvers/Optim.jl/actions/workflows/mac.yml)
+[![Codecov branch](https://img.shields.io/codecov/c/github/JuliaNLSolvers/Optim.jl/master.svg)](https://codecov.io/gh/JuliaNLSolvers/Optim.jl)
+[![JOSS](http://joss.theoj.org/papers/10.21105/joss.00615/status.svg)](https://doi.org/10.21105/joss.00615)
 
 Univariate and multivariate optimization in Julia.
 
-Optim.jl is part of the [JuliaNLSolvers](https://github.com/JuliaNLSolvers) family.
+Optim.jl is part of the [JuliaNLSolvers](https://github.com/JuliaNLSolvers)
+family.
 
-For direct contact to the maintainer, you can reach out directly to pkofod on [slack](https://julialang.org/slack/).
+## Help and support
 
-| **Documentation** | **Build Status** | **Reference to cite** |
-|:-:|:-:|:-:|
-| [![][docs-stable-img]][docs-stable-url] | [![Build Status][build-linux-img]][build-linux-url] | [![JOSS][joss-img]][joss-url] |
-| |[![Build Status][build-mac-img]][build-mac-url] | |
-| |[![Build Status][build-windows-img]][build-windows-url] | |
-| |[![Codecov branch][cov-img]][cov-url] | |
+For help and support, please post on the [Optimization (Mathematical)](https://discourse.julialang.org/c/domain/opt/13)
+section of the Julia discourse or the `#math-optimization` channel of the Julia [slack](https://julialang.org/slack/).
 
-# Optimization
+## Installation
 
-Optim.jl is a package for univariate and multivariate optimization of functions.
-A typical example of the usage of Optim.jl is
+Install `Optim.jl` using the Julia package manager:
 ```julia
-using Optim
-rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
-result = optimize(rosenbrock, zeros(2), BFGS())
+import Pkg
+Pkg.add("Optim")
 ```
-This minimizes the [Rosenbrock function](https://en.wikipedia.org/wiki/Rosenbrock_function)
 
-$$
-f(x, y) = (a - x)^2 + b(y - x^2)^2
-$$
+## Documentation
 
-with $a = 1$, $b = 100$ and the initial values $x=0$, $y=0$.
-The minimum is at $(a,a^2)$.
+The online documentation is available at [https://julianlsolvers.github.io/Optim.jl/stable](https://julianlsolvers.github.io/Optim.jl/stable).
 
-The above code gives the output
-```jlcon
+## Example
 
-* Status: success
+To minimize the [Rosenbrock function](https://en.wikipedia.org/wiki/Rosenbrock_function),
+do:
+```julia
+julia> using Optim
 
-* Candidate solution
-    Minimizer: [1.00e+00, 1.00e+00]
-    Minimum:   5.471433e-17
+julia> rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
+rosenbrock (generic function with 1 method)
 
-* Found with
-    Algorithm:     BFGS
-    Initial Point: [0.00e+00, 0.00e+00]
+julia> result = optimize(rosenbrock, zeros(2), BFGS())
+ * Status: success
 
-* Convergence measures
-    |x - x'|               = 3.47e-07 ≰ 0.0e+00
-    |x - x'|/|x'|          = 3.47e-07 ≰ 0.0e+00
-    |f(x) - f(x')|         = 6.59e-14 ≰ 0.0e+00
-    |f(x) - f(x')|/|f(x')| = 1.20e+03 ≰ 0.0e+00
-    |g(x)|                 = 2.33e-09 ≤ 1.0e-08
+ * Candidate solution
+    Final objective value:     5.471433e-17
 
-* Work counters
-    Seconds run:   0  (vs limit Inf)
-    Iterations:    16
-    f(x) calls:    53
-    ∇f(x) calls:   53
-```
-To get information on the keywords used to construct method instances, use the Julia REPL help prompt (`?`)
+ * Found with
+    Algorithm:     BFGS
+
+ * Convergence measures
+    |x - x'|               = 3.47e-07 ≰ 0.0e+00
+    |x - x'|/|x'|          = 3.47e-07 ≰ 0.0e+00
+    |f(x) - f(x')|         = 6.59e-14 ≰ 0.0e+00
+    |f(x) - f(x')|/|f(x')| = 1.20e+03 ≰ 0.0e+00
+    |g(x)|                 = 2.33e-09 ≤ 1.0e-08
+
+ * Work counters
+    Seconds run:   0  (vs limit Inf)
+    Iterations:    16
+    f(x) calls:    53
+    ∇f(x) calls:   53
+
+julia> Optim.minimizer(result)
+2-element Vector{Float64}:
+ 0.9999999926033423
+ 0.9999999852005355
+
+julia> Optim.minimum(result)
+5.471432670590216e-17
 ```
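(Aside, beyond this commit's content: in the example above no gradient is supplied, so Optim approximates it by finite differences. A minimal sketch of passing an analytic in-place gradient via Optim's `optimize(f, g!, x0, method)` form; `rosenbrock_grad!` is a name introduced here for illustration.)

```julia
using Optim

rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

# In-place gradient of the Rosenbrock function:
# ∂f/∂x₁ = -2(1 - x₁) - 400 x₁ (x₂ - x₁²),  ∂f/∂x₂ = 200 (x₂ - x₁²)
function rosenbrock_grad!(G, x)
    G[1] = -2.0 * (1.0 - x[1]) - 400.0 * (x[2] - x[1]^2) * x[1]
    G[2] = 200.0 * (x[2] - x[1]^2)
    return G
end

result = optimize(rosenbrock, rosenbrock_grad!, zeros(2), BFGS())
```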
+
+To get information on the keywords used to construct method instances, use the
+Julia REPL help prompt (`?`)
+```julia
 help?> LBFGS
 search: LBFGS
 
-     LBFGS
-    ≡≡≡≡≡≡≡
+  LBFGS
+  ≡≡≡≡≡
 
-     Constructor
-    =============
+  Constructor
+  ===========
 
   LBFGS(; m::Integer = 10,
   alphaguess = LineSearches.InitialStatic(),
@@ -77,58 +91,63 @@ search: LBFGS
   manifold = Flat(),
   scaleinvH0::Bool = true && (typeof(P) <: Nothing))
 
-  LBFGS has two special keywords; the memory length m, and
-  the scaleinvH0 flag. The memory length determines how many
-  previous Hessian approximations to store. When scaleinvH0
-  == true, then the initial guess in the two-loop recursion
-  to approximate the inverse Hessian is the scaled identity,
-  as can be found in Nocedal and Wright (2nd edition) (sec.
-  7.2).
+  LBFGS has two special keywords; the memory length m, and the scaleinvH0 flag.
+  The memory length determines how many previous Hessian approximations to
+  store. When scaleinvH0 == true, then the initial guess in the two-loop
+  recursion to approximate the inverse Hessian is the scaled identity, as can be
+  found in Nocedal and Wright (2nd edition) (sec. 7.2).
 
-  In addition, LBFGS supports preconditioning via the P and
-  precondprep keywords.
+  In addition, LBFGS supports preconditioning via the P and precondprep keywords.
 
-     Description
-    =============
+  Description
+  ===========
 
-  The LBFGS method implements the limited-memory BFGS
-  algorithm as described in Nocedal and Wright (sec. 7.2,
-  2006) and original paper by Liu & Nocedal (1989). It is a
-  quasi-Newton method that updates an approximation to the
+  The LBFGS method implements the limited-memory BFGS algorithm as described in
+  Nocedal and Wright (sec. 7.2, 2006) and original paper by Liu & Nocedal
+  (1989). It is a quasi-Newton method that updates an approximation to the
   Hessian using past approximations as well as the gradient.
 
-     References
-    ============
+  References
+  ==========
 
-    • Wright, S. J. and J. Nocedal (2006), Numerical
-      optimization, 2nd edition. Springer
+  • Wright, S. J. and J. Nocedal (2006), Numerical optimization, 2nd edition.
+    Springer
 
-    • Liu, D. C. and Nocedal, J. (1989). "On the
-      Limited Memory Method for Large Scale
-      Optimization". Mathematical Programming B. 45
-      (3): 503–528
+  • Liu, D. C. and Nocedal, J. (1989). "On the Limited Memory Method for
+    Large Scale Optimization". Mathematical Programming B. 45 (3): 503–528
 ```
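(Aside, beyond this commit's content: the constructor keywords shown in the help text above can be passed when selecting a method. A minimal sketch, reusing `rosenbrock` from the example; the specific values are illustrative only.)

```julia
using Optim

rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

# Keep 20 previous Hessian approximations instead of the default 10,
# and skip the scaled-identity initial guess in the two-loop recursion.
method = LBFGS(m = 20, scaleinvH0 = false)

result = optimize(rosenbrock, zeros(2), method)
```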
 
-# Documentation
-For more details and options, see the documentation
-- [STABLE][docs-stable-url] — most recently tagged version of the documentation.
-- [LATEST][docs-latest-url] — in-development version of the documentation.
+## Use with JuMP
 
-# Installation
-
-The package is a registered package, and can be installed with `Pkg.add`.
+You can use Optim.jl with [JuMP.jl](https://github.com/jump-dev/JuMP.jl) as
+follows:
 
 ```julia
-julia> using Pkg; Pkg.add("Optim")
-```
-or through the `pkg` REPL mode by typing
-```
-] add Optim
+julia> using JuMP, Optim
+
+julia> model = Model(Optim.Optimizer);
+
+julia> set_optimizer_attribute(model, "method", BFGS())
+
+julia> @variable(model, x[1:2]);
+
+julia> @objective(model, Min, (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2)
+(x[1]² - 2 x[1] + 1) + (100.0 * ((-x[1]² + x[2]) ^ 2.0))
+
+julia> optimize!(model)
+
+julia> objective_value(model)
+3.7218241804173566e-21
+
+julia> value.(x)
+2-element Vector{Float64}:
+ 0.9999999999373603
+ 0.99999999986862
 ```
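(Aside, beyond this commit's content: the "method" attribute shown above is how other Optim methods would be selected under JuMP as well; a hedged sketch using the derivative-free `NelderMead()`, assuming the attribute accepts any Optim method instance.)

```julia
using JuMP, Optim

model = Model(Optim.Optimizer)
# Swap the quasi-Newton BFGS() for Optim's derivative-free simplex
# method; assumes "method" accepts any Optim method instance.
set_optimizer_attribute(model, "method", NelderMead())
```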
 
-# Citation
+## Citation
 
-If you use `Optim.jl` in your work, please cite the following.
+If you use `Optim.jl` in your work, please cite the following:
 
 ```tex
 @article{mogensen2018optim,
@@ -142,48 +161,3 @@ If you use `Optim.jl` in your work, please cite the following.
   doi = {10.21105/joss.00615}
 }
 ```
-
-# Use with JuMP
-
-We can use Optim.jl with [JuMP.jl](https://github.com/jump-dev/JuMP.jl).
-
-This can be done using the `Optim.Optimizer` object. Here is how to create a JuMP
-model that uses Optim as the solver to minimize the rosenbrock function.
-
-```julia
-using JuMP, Optim
-
-model = Model(Optim.Optimizer)
-set_optimizer_attribute(model, "method", BFGS())
-
-@variable(model, x[1:2])
-@objective(model, (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2)
-optimize!(model)
-```
-
-[docs-latest-img]: https://img.shields.io/badge/docs-latest-blue.svg
-[docs-latest-url]: https://julianlsolvers.github.io/Optim.jl/latest
-
-[docs-stable-img]: https://img.shields.io/badge/docs-stable-blue.svg
-[docs-stable-url]: https://julianlsolvers.github.io/Optim.jl/stable
-
-[build-linux-img]: https://github.com/JuliaNLSolvers/Optim.jl/actions/workflows/linux.yml/badge.svg
-[build-linux-url]: https://github.com/JuliaNLSolvers/Optim.jl/actions/workflows/linux.yml
-
-[build-windows-img]: https://github.com/JuliaNLSolvers/Optim.jl/actions/workflows/windows.yml/badge.svg
-[build-windows-url]: https://github.com/JuliaNLSolvers/Optim.jl/actions/workflows/windows.yml
-
-[build-mac-img]: https://github.com/JuliaNLSolvers/Optim.jl/actions/workflows/mac.yml/badge.svg
-[build-mac-url]: https://github.com/JuliaNLSolvers/Optim.jl/actions/workflows/mac.yml
-
-[cov-img]: https://img.shields.io/codecov/c/github/JuliaNLSolvers/Optim.jl/master.svg?maxAge=2592000
-[cov-url]: https://codecov.io/gh/JuliaNLSolvers/Optim.jl
-
-[gitter-url]: https://gitter.im/JuliaNLSolvers/Optim.jl
-[gitter-img]: https://badges.gitter.im/JuliaNLSolvers/Optim.jl.svg
-
-[zenodo-url]: https://zenodo.org/badge/latestdoi/3933868
-[zenodo-img]: https://zenodo.org/badge/3933868.svg
-
-[joss-url]: https://doi.org/10.21105/joss.00615
-[joss-img]: http://joss.theoj.org/papers/10.21105/joss.00615/status.svg