
Commit 2d741c5

Author: SciML Bot

Apply JuliaFormatter to fix code formatting

- Applied JuliaFormatter with SciML style guide
- Formatted 44 files

🤖 Generated by OrgMaintenanceScripts.jl

1 parent a9ceb38 · commit 2d741c5


44 files changed (+1284, -1009 lines)

.JuliaFormatter.toml

Lines changed: 3 additions & 0 deletions

@@ -0,0 +1,3 @@
+style = "sciml"
+format_markdown = true
+format_docstrings = true
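Note: a formatting pass like this one can be reproduced locally with JuliaFormatter.jl. The exact entry point used by OrgMaintenanceScripts.jl is not shown in this commit, so the call below is a minimal sketch:

```julia
# Minimal sketch: run from the repository root.
# format(".") walks the repository and picks up .JuliaFormatter.toml,
# so style = "sciml", format_markdown, and format_docstrings all apply.
using JuliaFormatter
format(".")
```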

README.md

Lines changed: 44 additions & 53 deletions
@@ -1,44 +1,34 @@
 # ComponentArrays.jl
-![](assets/logo.png)
-
-| **Documentation** | **Build Status** |
-|:-------------------------------------------------------------------------:|:-----------------------------------------------------------:|
-| [![][docs-stable-img]][docs-stable-url] [![][docs-dev-img]][docs-dev-url] | [![][build-img]][build-url] [![][codecov-img]][codecov-url] |
-
 
-[docs-dev-img]: https://img.shields.io/badge/docs-latest-blue.svg
-[docs-dev-url]: https://docs.sciml.ai/ComponentArrays/dev/
-
-[docs-stable-img]: https://img.shields.io/badge/docs-stable-blue.svg
-[docs-stable-url]: https://docs.sciml.ai/ComponentArrays/stable/
-
-[build-img]: https://img.shields.io/github/actions/workflow/status/SciML/ComponentArrays.jl/ci.yml
-[build-url]: https://github.com/SciML/ComponentArrays.jl/actions/workflows/ci.yml
+![](assets/logo.png)
 
-[codecov-img]: https://img.shields.io/codecov/c/github/SciML/ComponentArrays.jl
-[codecov-url]: https://codecov.io/gh/SciML/ComponentArrays.jl
+| **Documentation** | **Build Status** |
+|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------:|
+| [![](https://img.shields.io/badge/docs-stable-blue.svg)](https://docs.sciml.ai/ComponentArrays/stable/) [![](https://img.shields.io/badge/docs-latest-blue.svg)](https://docs.sciml.ai/ComponentArrays/dev/) | [![](https://img.shields.io/github/actions/workflow/status/SciML/ComponentArrays.jl/ci.yml)](https://github.com/SciML/ComponentArrays.jl/actions/workflows/ci.yml) [![](https://img.shields.io/codecov/c/github/SciML/ComponentArrays.jl)](https://codecov.io/gh/SciML/ComponentArrays.jl) |
 
-The main export of this package is the ````ComponentArray```` type. "Components" of ````ComponentArray````s
-are really just array blocks that can be accessed through a named index. This will create a new ```ComponentArray``` whose data is a view into the original,
+The main export of this package is the ``ComponentArray`` type. "Components" of ``ComponentArray``s
+are really just array blocks that can be accessed through a named index. This will create a new `ComponentArray` whose data is a view into the original,
 allowing for standalone models to be composed together by simple function composition. In
-essence, ```ComponentArray```s allow you to do the things you would usually need a modeling
+essence, `ComponentArray`s allow you to do the things you would usually need a modeling
 language for, but without actually needing a modeling language. The main targets are for use
 in [DifferentialEquations.jl](https://github.com/SciML/DifferentialEquations.jl) and
 [Optim.jl](https://github.com/JuliaNLSolvers/Optim.jl), but anything that requires
 flat vectors is fair game.
 
 Check out the [NEWS](https://github.com/SciML/ComponentArrays.jl/blob/master/NEWS.md) for new features by minor release version.
 
-
 ## General use
-The easiest way to construct 1-dimensional ```ComponentArray```s (aliased as `ComponentVector`) is as if they were ```NamedTuple```s. In fact, a good way to think about them is as arbitrarily nested, mutable ```NamedTuple```s that can be passed through a solver.
+
+The easiest way to construct 1-dimensional `ComponentArray`s (aliased as `ComponentVector`) is as if they were `NamedTuple`s. In fact, a good way to think about them is as arbitrarily nested, mutable `NamedTuple`s that can be passed through a solver.
+
 ```julia
-julia> c = (a=2, b=[1, 2]);
+julia> c = (a = 2, b = [1, 2]);
 
-julia> x = ComponentArray(a=5, b=[(a=20., b=0), (a=33., b=0), (a=44., b=3)], c=c)
+julia> x = ComponentArray(a = 5, b = [(a = 20.0, b = 0), (a = 33.0, b = 0), (a = 44.0, b = 3)], c = c)
 ComponentVector{Float64}(a = 5.0, b = [(a = 20.0, b = 0.0), (a = 33.0, b = 0.0), (a = 44.0, b = 3.0)], c = (a = 2.0, b = [1.0, 2.0]))
 
-julia> x.c.a = 400; x
+julia> x.c.a = 400;
+x
 ComponentVector{Float64}(a = 5.0, b = [(a = 20.0, b = 0.0), (a = 33.0, b = 0.0), (a = 44.0, b = 3.0)], c = (a = 400.0, b = [1.0, 2.0]))
 
 julia> x[8]
@@ -57,21 +47,25 @@ julia> collect(x)
  1.0
  2.0
 
-julia> typeof(similar(x, Int32)) === typeof(ComponentVector{Int32}(a=5, b=[(a=20., b=0), (a=33., b=0), (a=44., b=3)], c=c))
+julia> typeof(similar(x, Int32)) === typeof(ComponentVector{Int32}(a = 5, b = [
+           (a = 20.0, b = 0), (a = 33.0, b = 0), (a = 44.0, b = 3)], c = c))
 true
 ```
+
 `ComponentArray`s can be constructed from existing
 `ComponentArray`s (currently nested fields cannot be changed this way):
+
 ```julia
-julia> x = ComponentVector(a=1, b=2, c=3);
+julia> x = ComponentVector(a = 1, b = 2, c = 3);
 
-julia> ComponentVector(x; a=11, new=42)
+julia> ComponentVector(x; a = 11, new = 42)
 ComponentVector{Int64}(a = 11, b = 2, c = 3, new = 42)
 ```
 
-Higher dimensional ```ComponentArray```s can be created too, but it's a little messy at the moment. The nice thing for modeling is that dimension expansion through broadcasted operations can create higher-dimensional ```ComponentArray```s automatically, so Jacobian cache arrays that are created internally with ```false .* x .* x'``` will be two-dimensional ```ComponentArray```s (aliased as `ComponentMatrix`) with proper axes. Check out the [ODE with Jacobian](https://github.com/jonniedie/ComponentArrays.jl/blob/master/examples/ODE_jac_example.jl) example in the examples folder to see how this looks in practice.
+Higher dimensional `ComponentArray`s can be created too, but it's a little messy at the moment. The nice thing for modeling is that dimension expansion through broadcasted operations can create higher-dimensional `ComponentArray`s automatically, so Jacobian cache arrays that are created internally with `false .* x .* x'` will be two-dimensional `ComponentArray`s (aliased as `ComponentMatrix`) with proper axes. Check out the [ODE with Jacobian](https://github.com/jonniedie/ComponentArrays.jl/blob/master/examples/ODE_jac_example.jl) example in the examples folder to see how this looks in practice.
+
 ```julia
-julia> x = ComponentArray(a=1, b=[2, 1, 4.0], c=c)
+julia> x = ComponentArray(a = 1, b = [2, 1, 4.0], c = c)
 ComponentVector{Float64}(a = 1.0, b = [2.0, 1.0, 4.0], c = (a = 2.0, b = [1.0, 2.0]))
 
 julia> x2 = x .* x'
@@ -84,42 +78,42 @@ julia> x2 = x .* x'
  1.0  2.0  1.0  4.0  2.0  1.0  2.0
  2.0  4.0  2.0  8.0  4.0  2.0  4.0
 
-julia> x2[:c,:c]
+julia> x2[:c, :c]
 3×3 ComponentMatrix{Float64} with axes Axis(a = 1, b = 2:3) × Axis(a = 1, b = 2:3)
  4.0  2.0  4.0
  2.0  1.0  2.0
  4.0  2.0  4.0
 
-julia> x2[:a,:a]
+julia> x2[:a, :a]
 1.0
 
-julia> @view x2[:a,:c]
+julia> @view x2[:a, :c]
 ComponentVector{Float64,SubArray...}(a = 2.0, b = [1.0, 2.0])
 
-julia> x2[:b,:c]
+julia> x2[:b, :c]
 3×3 ComponentMatrix{Float64} with axes FlatAxis() × Axis(a = 1, b = 2:3)
  4.0  2.0  4.0
  2.0  1.0  2.0
  8.0  4.0  8.0
 ```
 
-
 ## Examples
+
 ### Differential equation example
-This example uses ```@unpack``` from [Parameters.jl](https://github.com/mauro3/Parameters.jl)
+
+This example uses `@unpack` from [Parameters.jl](https://github.com/mauro3/Parameters.jl)
 for nice syntax. Example taken from:
 https://github.com/JuliaDiffEq/ModelingToolkit.jl/issues/36#issuecomment-536221300
+
 ```julia
 using ComponentArrays
 using DifferentialEquations
 using Parameters: @unpack
 
-
 tspan = (0.0, 20.0)
 
-
 ## Lorenz system
-function lorenz!(D, u, p, t; f=0.0)
+function lorenz!(D, u, p, t; f = 0.0)
     @unpack σ, ρ, β = p
     @unpack x, y, z = u
 
@@ -129,41 +123,38 @@ function lorenz!(D, u, p, t; f=0.0)
     return nothing
 end
 
-lorenz_p = (σ=10.0, ρ=28.0, β=8/3)
-lorenz_ic = ComponentArray(x=0.0, y=0.0, z=0.0)
+lorenz_p = (σ = 10.0, ρ = 28.0, β = 8/3)
+lorenz_ic = ComponentArray(x = 0.0, y = 0.0, z = 0.0)
 lorenz_prob = ODEProblem(lorenz!, lorenz_ic, tspan, lorenz_p)
 
-
 ## Lotka-Volterra system
-function lotka!(D, u, p, t; f=0.0)
+function lotka!(D, u, p, t; f = 0.0)
    @unpack α, β, γ, δ = p
    @unpack x, y = u
 
-    D.x = α*x - β*x*y + f
+    D.x = α*x - β*x*y + f
     D.y = -γ*y + δ*x*y
     return nothing
 end
 
-lotka_p = (α=2/3, β=4/3, γ=1.0, δ=1.0)
-lotka_ic = ComponentArray(x=1.0, y=1.0)
+lotka_p = (α = 2/3, β = 4/3, γ = 1.0, δ = 1.0)
+lotka_ic = ComponentArray(x = 1.0, y = 1.0)
 lotka_prob = ODEProblem(lotka!, lotka_ic, tspan, lotka_p)
 
-
 ## Composed Lorenz and Lotka-Volterra system
 function composed!(D, u, p, t)
     c = p.c #coupling parameter
     @unpack lorenz, lotka = u
 
-    lorenz!(D.lorenz, lorenz, p.lorenz, t, f=c*lotka.x)
-    lotka!(D.lotka, lotka, p.lotka, t, f=c*lorenz.x)
+    lorenz!(D.lorenz, lorenz, p.lorenz, t, f = c*lotka.x)
+    lotka!(D.lotka, lotka, p.lotka, t, f = c*lorenz.x)
     return nothing
 end
 
-comp_p = (lorenz=lorenz_p, lotka=lotka_p, c=0.01)
-comp_ic = ComponentArray(lorenz=lorenz_ic, lotka=lotka_ic)
+comp_p = (lorenz = lorenz_p, lotka = lotka_p, c = 0.01)
+comp_ic = ComponentArray(lorenz = lorenz_ic, lotka = lotka_ic)
 comp_prob = ODEProblem(composed!, comp_ic, tspan, comp_p)
 
-
 ## Solve problem
 # We can solve the composed system...
 comp_sol = solve(comp_prob)
@@ -172,6 +163,6 @@ comp_sol = solve(comp_prob)
 lotka_sol = solve(lotka_prob)
 ```
 
-Notice how cleanly the ```composed!``` function can pass variables from one function to another with no array index juggling in sight. This is especially useful for large models as it becomes harder to keep track top-level model array position when adding new or deleting old components from the model. We could go further and compose ```composed!``` with other components ad (practically) infinitum with no mental bookkeeping.
+Notice how cleanly the `composed!` function can pass variables from one function to another with no array index juggling in sight. This is especially useful for large models as it becomes harder to keep track top-level model array position when adding new or deleting old components from the model. We could go further and compose `composed!` with other components ad (practically) infinitum with no mental bookkeeping.
 
-The main benefit, however, is now our differential equations are unit testable. Both ```lorenz``` and ```lotka``` can be run as their own ```ODEProblem``` with ```f``` set to zero to see the unforced response.
+The main benefit, however, is now our differential equations are unit testable. Both `lorenz` and `lotka` can be run as their own `ODEProblem` with `f` set to zero to see the unforced response.
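Note: the README's claim above that component access is a view into the original flat data can be checked directly. A minimal sketch with hypothetical values (not part of this diff):

```julia
using ComponentArrays

v = ComponentArray(a = 1.0, b = [2.0, 3.0])
bv = @view v[:b]   # a view into the underlying flat buffer
bv[1] = 20.0       # mutate through the view
v.b                # now [20.0, 3.0]; the parent ComponentArray sees the change
getdata(v)         # the flat backing vector: [1.0, 20.0, 3.0]
```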

docs/make.jl

Lines changed: 10 additions & 10 deletions
@@ -5,27 +5,27 @@ using Documenter.Remotes: GitHub
 DocMeta.setdocmeta!(ComponentArrays, :DocTestSetup, :(using ComponentArrays))
 
 makedocs(;
-    modules=[ComponentArrays],
-    format=Documenter.HTML(
-        canonical="https://sciml.github.io/ComponentArrays.jl/stable",
+    modules = [ComponentArrays],
+    format = Documenter.HTML(
+        canonical = "https://sciml.github.io/ComponentArrays.jl/stable",
     ),
-    pages=[
+    pages = [
         "Home" => "index.md",
         "Quick Start" => "quickstart.md",
         "Indexing Behavior" => "indexing_behavior.md",
         "Unpacking to StaticArrays" => "static_unpack.md",
         "Examples" => [
             "examples/DiffEqFlux.md",
             "examples/adaptive_control.md",
-            "examples/ODE_jac.md",
+            "examples/ODE_jac.md"
         ],
-        "API" => "api.md",
+        "API" => "api.md"
     ],
-    repo=GitHub("SciML/ComponentArrays.jl"),
-    sitename="ComponentArrays.jl",
-    authors="Jonnie Diegelman",
+    repo = GitHub("SciML/ComponentArrays.jl"),
+    sitename = "ComponentArrays.jl",
+    authors = "Jonnie Diegelman"
 )
 
 deploydocs(;
-    repo="github.com/SciML/ComponentArrays.jl.git",
+    repo = "github.com/SciML/ComponentArrays.jl.git",
 )
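Note: to verify that the reformatted docs/make.jl still builds, a hypothetical local run (the CI workflow's exact steps are not part of this commit):

```julia
# Sketch: run from the repository root with the docs environment.
using Pkg
Pkg.activate("docs")
Pkg.develop(path = ".")   # use the local ComponentArrays checkout
Pkg.instantiate()
include("docs/make.jl")   # deploydocs typically no-ops outside CI
```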

docs/src/api.md

Lines changed: 1 addition & 1 deletion
@@ -2,4 +2,4 @@
 
 ```@autodocs
 Modules = [ComponentArrays]
-```
\ No newline at end of file
+```

docs/src/examples/DiffEqFlux.md

Lines changed: 23 additions & 15 deletions
@@ -1,8 +1,9 @@
 # Neural ODEs with DiffEqFlux
+
 Let's see how easy it is to make dense neural ODE layers from scratch.
 Flux is used here just for the `glorot_uniform` function and the `ADAM` optimizer.
 
-This example is taken from [the DiffEqFlux documentation](https://diffeqflux.sciml.ai/dev/Flux/). 
+This example is taken from [the DiffEqFlux documentation](https://diffeqflux.sciml.ai/dev/Flux/).
 
 ```julia
 using ComponentArrays
@@ -16,14 +17,15 @@ using Optim: LBFGS
 ```
 
 First, let's set up the problem and create the truth data.
+
 ```julia
-u0 = Float32[2.; 0.]
+u0 = Float32[2.0; 0.0]
 datasize = 30
 tspan = (0.0f0, 1.5f0)
 
 function trueODEfunc(du, u, p, t)
     true_A = [-0.1 2.0; -2.0 -0.1]
-    du .= ((u.^3)'true_A)'
+    du .= ((u .^ 3)'true_A)'
 end
 
 t = range(tspan[1], tspan[2], length = datasize)
@@ -32,27 +34,31 @@ ode_data = Array(solve(prob, Tsit5(), saveat = t))
 ```
 
 Next we'll make a function that creates dense neural layer components. It is similar to `Flux.Dense`, except it doesn't handle the activation function. We'll do that separately.
+
 ```julia
-dense_layer(in, out) = ComponentArray{Float32}(W=glorot_uniform(out, in), b=zeros(out))
+dense_layer(in, out) = ComponentArray{Float32}(W = glorot_uniform(out, in), b = zeros(out))
 ```
 
 Our parameter vector will be a `ComponentArray` that holds the ODE initial conditions and the dense neural layers.
+
 ```julia
-layers = (L1=dense_layer(2, 50), L2=dense_layer(50, 2))
-θ = ComponentArray(u=u0, p=layers)
+layers = (L1 = dense_layer(2, 50), L2 = dense_layer(50, 2))
+θ = ComponentArray(u = u0, p = layers)
 ```
 
 We now have convenient struct-like access to the weights and biases of the layers for our neural ODE function while giving our optimizer something that acts like a flat array.
+
 ```julia
 function dudt(u, p, t)
     @unpack L1, L2 = p
-    return L2.W * tanh.(L1.W * u.^3 .+ L1.b) .+ L2.b
+    return L2.W * tanh.(L1.W * u .^ 3 .+ L1.b) .+ L2.b
 end
 
 prob = ODEProblem(dudt, u0, tspan)
 ```
+
 ```julia
-predict_n_ode(θ) = Array(solve(prob, Tsit5(), u0=θ.u, p=θ.p, saveat=t))
+predict_n_ode(θ) = Array(solve(prob, Tsit5(), u0 = θ.u, p = θ.p, saveat = t))
 
 function loss_n_ode(θ)
     pred = predict_n_ode(θ)
@@ -63,23 +69,25 @@ loss_n_ode(θ)
 ```
 
 Let's set up a training observation callback and train!
+
 ```julia
-cb = function (θ, loss, pred; doplot=false)
+cb = function (θ, loss, pred; doplot = false)
     display(loss)
     # plot current prediction against data
-    pl = scatter(t, ode_data[1,:], label = "data")
-    scatter!(pl, t, pred[1,:], label = "prediction")
+    pl = scatter(t, ode_data[1, :], label = "data")
+    scatter!(pl, t, pred[1, :], label = "prediction")
     display(plot(pl))
     return false
 end
 cb(θ, loss_n_ode(θ)...)
 
 data = Iterators.repeated((), 1000)
 
-res1 = sciml_train(loss_n_ode, θ, ADAM(0.05); cb=cb, maxiters=100)
-cb(res1.minimizer, loss_n_ode(res1.minimizer)...; doplot=true)
+res1 = sciml_train(loss_n_ode, θ, ADAM(0.05); cb = cb, maxiters = 100)
+cb(res1.minimizer, loss_n_ode(res1.minimizer)...; doplot = true)
 
-res2 = sciml_train(loss_n_ode, res1.minimizer, LBFGS(); cb=cb)
-cb(res2.minimizer, loss_n_ode(res2.minimizer)...; doplot=true)
+res2 = sciml_train(loss_n_ode, res1.minimizer, LBFGS(); cb = cb)
+cb(res2.minimizer, loss_n_ode(res2.minimizer)...; doplot = true)
 ```
+
 ![](../assets/DiffEqFlux.gif)
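Note: the reason `sciml_train` can consume `θ` directly in the example above is that a `ComponentVector` behaves as an ordinary flat vector. A small hypothetical check (not part of this diff), assuming `θ`, `u0`, and `layers` from the example:

```julia
length(θ)       # total parameter count: u plus all L1/L2 weights and biases
θ[1:2] == θ.u   # true; the first block of the flat data is the initial condition u0
θ.p.L1.W        # struct-like access into the same flat data
```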
