Merged
Changes from 8 commits
3 changes: 2 additions & 1 deletion .gitignore
@@ -5,5 +5,6 @@ Manifest.toml
.DS_Store
/docs/build/*
/docs/src/libs/*/examples
/docs/src/assets/Project.toml
hall_of_fame.*
/tmp/*
/tmp/*
4 changes: 3 additions & 1 deletion Project.toml
@@ -35,6 +35,7 @@ ProgressMeter = "1.6"
QuadGK = "2.4"
RecipesBase = "1"
Reexport = "1.0"
OrdinaryDiffEqTsit5 = "1"
Setfield = "1"
Statistics = "1"
StatsBase = "0.32.0, 0.33, 0.34"
@@ -43,10 +44,11 @@ Symbolics = "5.30.1, 6"
julia = "1.10"

[extras]
OrdinaryDiffEqTsit5 = "b1df2697-797e-41e3-8120-5422d3b24e4a"
Pkg = "44cfe95a-1eb2-52ea-b672-e2afdf69b78f"
Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
SafeTestsets = "1bc83da4-3b8d-516f-aca4-4fe02f6d838f"
Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"

[targets]
test = ["Test", "Random", "SafeTestsets", "Pkg"]
test = ["Test", "Random", "SafeTestsets", "Pkg", "OrdinaryDiffEqTsit5"]
13 changes: 9 additions & 4 deletions docs/make.jl
@@ -12,6 +12,8 @@ dev_subpkg("DataDrivenDMD")
dev_subpkg("DataDrivenSparse")
dev_subpkg("DataDrivenSR")

Pkg.precompile()
Member:

This shouldn't be necessary?

Contributor (author):

Just slightly faster to precompile everything in parallel, instead of doing it piecewise during the subsequent `using` statements. But definitely not strictly necessary.

Member:

Parallel precompilation on `using` should be available on recent Julia versions (1.10+, or even earlier).
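For context, the tradeoff discussed in this thread can be sketched as follows (illustrative excerpt in the spirit of `docs/make.jl`; the explicit call only changes *when* the precompilation work happens, not whether it happens):

```julia
using Pkg

# Develop the sub-packages into the docs environment
# (dev_subpkg is the helper defined earlier in make.jl).
dev_subpkg("DataDrivenDMD")
dev_subpkg("DataDrivenSparse")
dev_subpkg("DataDrivenSR")

# Optional: precompile the whole environment in parallel up front.
# On Julia 1.10+ the first `using` already triggers parallel
# precompilation, so this line is a convenience, not a requirement.
Pkg.precompile()

using DataDrivenDiffEq  # loads from the precompile cache at this point
```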


using Documenter
using DataDrivenDiffEq
using DataDrivenDMD
@@ -21,8 +23,8 @@ using DataDrivenSR
using StatsBase
using Literate

cp("./docs/Manifest.toml", "./docs/src/assets/Manifest.toml", force = true)
cp("./docs/Project.toml", "./docs/src/assets/Project.toml", force = true)
cp(joinpath(@__DIR__, "Manifest.toml"), joinpath(@__DIR__, "src", "assets", "Manifest.toml"), force = true)
cp(joinpath(@__DIR__, "Project.toml"), joinpath(@__DIR__, "src", "assets", "Project.toml"), force = true)

ENV["GKSwstype"] = "100"

@@ -73,9 +75,12 @@ makedocs(sitename = "DataDrivenDiffEq.jl",
modules = [DataDrivenDiffEq, DataDrivenDMD, DataDrivenSparse, DataDrivenSR],
clean = true, doctest = false, linkcheck = true,
warnonly = [:missing_docs, :cross_references],
linkcheck_ignore = ["http://cwrowley.princeton.edu/papers/Hemati-2017a.pdf",
linkcheck_ignore = [
"http://cwrowley.princeton.edu/papers/Hemati-2017a.pdf",
"https://royalsocietypublishing.org/doi/10.1098/rspa.2020.0279",
"https://www.pnas.org/doi/10.1073/pnas.1517384113"],
"https://www.pnas.org/doi/10.1073/pnas.1517384113",
"https://link.springer.com/article/10.1007/s00332-015-9258-5",
],
format = Documenter.HTML(assets = ["assets/favicon.ico"],
canonical = "https://docs.sciml.ai/DataDrivenDiffEq/stable/"),
pages = pages)
16 changes: 0 additions & 16 deletions docs/src/assets/Project.toml

This file was deleted.

21 changes: 12 additions & 9 deletions docs/src/citations.md
@@ -15,16 +15,19 @@ If you are using `DataDrivenDiffEq.jl` for research, please cite
}
```

If you are using the [SymbolicRegression.jl](https://docs.sciml.ai/SymbolicRegression/stable/) API, please cite
If you are using the [SymbolicRegression.jl](https://ai.damtp.cam.ac.uk/symbolicregression/dev/) API, please cite

```bibtex
@software{pysr,
author = {Miles Cranmer},
title = {PySR: Fast \& Parallelized Symbolic Regression in Python/Julia},
month = sep,
year = 2020,
publisher = {Zenodo},
doi = {10.5281/zenodo.4041459},
url = {http://doi.org/10.5281/zenodo.4041459}
@misc{cranmerInterpretableMachineLearning2023,
title = {Interpretable {Machine} {Learning} for {Science} with {PySR} and {SymbolicRegression}.jl},
url = {http://arxiv.org/abs/2305.01582},
doi = {10.48550/arXiv.2305.01582},
urldate = {2023-07-17},
publisher = {arXiv},
author = {Cranmer, Miles},
month = may,
year = {2023},
note = {arXiv:2305.01582 [astro-ph, physics:physics]},
keywords = {Astrophysics - Instrumentation and Methods for Astrophysics, Computer Science - Machine Learning, Computer Science - Neural and Evolutionary Computing, Computer Science - Symbolic Computation, Physics - Data Analysis, Statistics and Probability},
}
```
2 changes: 1 addition & 1 deletion docs/src/index.md
@@ -94,7 +94,7 @@ Pkg.add("DataDrivenSR")
## Contributing

- Please refer to the
[SciML ColPrac: Contributor's Guide on Collaborative Practices for Community Packages](https://github.com/SciML/ColPrac/blob/master/README.md)
[SciML ColPrac: Contributor's Guide on Collaborative Practices for Community Packages](https://github.com/SciML/ColPrac)
for guidance on PRs, issues, and other matters relating to contributing to SciML.

- See the [SciML Style Guide](https://github.com/SciML/SciMLStyle) for common coding practices and other style decisions.
2 changes: 1 addition & 1 deletion docs/src/libs/datadrivendmd/example_04.jl
@@ -1,6 +1,6 @@
# # [Nonlinear Time Continuous System](@id nonlinear_continuos)
#
# Similarly, we can use the [Extended Dynamic Mode Decomposition](https://link.springer.com/article/10.1007/s00332-015-9258-5) via a nonlinear [`Basis`](@ref) of observables. Here, we will look at a rather [famous example](https://arxiv.org/pdf/1510.03007.pdf) with a finite dimensional solution.
# Similarly, we can use the [Extended Dynamic Mode Decomposition](https://link.springer.com/article/10.1007/s00332-015-9258-5) via a nonlinear [`Basis`](@ref) of observables. Here, we will look at a rather [famous example](https://arxiv.org/pdf/1510.03007) with a finite dimensional solution.

using DataDrivenDiffEq
using LinearAlgebra
2 changes: 1 addition & 1 deletion docs/src/libs/datadrivensparse/example_03.jl
@@ -1,7 +1,7 @@
# # [Implicit Nonlinear Dynamics : Michaelis-Menten](@id michaelis_menten)
#
# What if you want to estimate an implicitly defined system of the form ``f(u_t, u, p, t) = 0``?
# The solution : Implicit Sparse Identification. This method was originally described in [this paper](http://ieeexplore.ieee.org/document/7809160/), and currently there exist [robust algorithms](https://royalsocietypublishing.org/doi/10.1098/rspa.2020.0279) to identify these systems.
# The solution : Implicit Sparse Identification. This method was originally described in [this paper](https://ieeexplore.ieee.org/document/7809160/), and currently there exist [robust algorithms](https://royalsocietypublishing.org/doi/10.1098/rspa.2020.0279) to identify these systems.
# We will focus on [Michaelis-Menten Kinetics](https://en.wikipedia.org/wiki/Michaelis%E2%80%93Menten_kinetics). As before, we will define the [`DataDrivenProblem`](@ref) and the [`Basis`](@ref) containing possible candidate functions for our [sparse regression](@ref sparse_algorithms).
# Let's generate some data! We will use two experiments starting from different initial conditions.

40 changes: 18 additions & 22 deletions docs/src/libs/datadrivensparse/example_04.jl
@@ -5,45 +5,41 @@

using DataDrivenDiffEq
using ModelingToolkit
using ModelingToolkit: t_nounits as t, D_nounits as D
using OrdinaryDiffEq
using LinearAlgebra
using DataDrivenSparse
#md using Plots
#md gr()
using Test #src

@parameters begin
t
α = 1.0
β = 1.3
γ = 2.0
δ = 0.5
@mtkmodel Autoregulation begin
@parameters begin
α = 1.0
β = 1.3
γ = 2.0
δ = 0.5
end
@variables begin
(x(t))[1:2] = [20.0; 12.0]
end
@equations begin
D(x[1]) ~ α / (1 + x[2]) - β * x[1]
D(x[2]) ~ γ / (1 + x[1]) - δ * x[2]
end
end

@variables begin
x[1:2](t) = [20.0; 12.0]
end

x = collect(x)
D = Differential(t)

eqs = [D(x[1]) ~ α / (1 + x[2]) - β * x[1];
D(x[2]) ~ γ / (1 + x[1]) - δ * x[2]]

sys = ODESystem(eqs, t, x, [α, β, γ, δ], name = :Autoregulation)

x0 = [x[1] => 20.0; x[2] => 12.0]

@mtkbuild sys = Autoregulation()
tspan = (0.0, 5.0)

de_problem = ODEProblem{true, SciMLBase.NoSpecialize}(sys, x0, tspan)
de_problem = ODEProblem{true, SciMLBase.NoSpecialize}(sys, [], tspan, [])
de_solution = solve(de_problem, Tsit5(), saveat = 0.005);
#md plot(de_solution)

# As always, we start by defining a [DataDrivenProblem](@ref problem) and a sufficient basis for sparse regression.

dd_prob = DataDrivenProblem(de_solution)

x = sys.x
eqs = [polynomial_basis(x, 4); D.(x); x .* D(x[1]); x .* D(x[2])]

basis = Basis(eqs, x, independent_variable = t, implicits = D.(x))
2 changes: 1 addition & 1 deletion docs/src/libs/datadrivensr/example_01.jl
@@ -5,7 +5,7 @@
# Hence, the performance might differ and depends strongly on the hyperparameters of the optimization.
# This example might not recover the groundtruth, but is showing off the use within `DataDrivenDiffEq.jl`.
#
# DataDrivenDiffEq offers an interface to [`SymbolicRegression.jl`](https://docs.sciml.ai/SymbolicRegression/stable/) to infer more complex functions.
# DataDrivenDiffEq offers an interface to [`SymbolicRegression.jl`](https://ai.damtp.cam.ac.uk/symbolicregression/dev/) to infer more complex functions.

using DataDrivenDiffEq
using LinearAlgebra
2 changes: 1 addition & 1 deletion lib/DataDrivenDMD/src/algorithms.jl
@@ -189,7 +189,7 @@ $(TYPEDEF)


Approximates the Koopman operator `K` via the forward-backward DMD.
It is assumed that `K = sqrt(K₁*inv(K₂))`, where `K₁` is the approximation via forward and `K₂` via [DMDSVD](@ref). Based on [this paper](https://arxiv.org/pdf/1507.02264.pdf).
It is assumed that `K = sqrt(K₁*inv(K₂))`, where `K₁` is the approximation via forward and `K₂` via [DMDSVD](@ref). Based on [this paper](https://arxiv.org/pdf/1507.02264).

If `truncation` ∈ (0, 1) is given, the singular value decomposition is reduced to include only
entries bigger than `truncation*maximum(Σ)`. If `truncation` is an integer, the reduced SVD up to `truncation` is used for computation.
2 changes: 1 addition & 1 deletion lib/DataDrivenSparse/src/algorithms/STLSQ.jl
@@ -2,7 +2,7 @@
$(TYPEDEF)
`STLSQ` is taken from the [original paper on SINDY](https://www.pnas.org/doi/10.1073/pnas.1517384113) and implements a
sequentially thresholded least squares iteration. `λ` is the threshold of the iteration.
It is based upon [this Matlab implementation](https://github.com/eurika-kaiser/SINDY-MPC/blob/master/utils/sparsifyDynamics.m).
It is based upon [this Matlab implementation](https://github.com/eurika-kaiser/SINDY-MPC/blob/e1dfd9908b2b56af303ee9fb30a133aced4fd757/utils/sparsifyDynamics.m).
It solves the following problem
```math
\\argmin_{x} \\frac{1}{2} \\| Ax-b\\|_2 + \\rho \\|x\\|_2
9 changes: 5 additions & 4 deletions src/problem/type.jl
@@ -152,10 +152,11 @@ function DataDrivenProblem(X::AbstractMatrix;
DX::AbstractMatrix = Array{eltype(X)}(undef, 0, 0),
Y::AbstractMatrix = Array{eltype(X)}(undef, 0, 0),
U::F = Array{eltype(X)}(undef, 0, 0),
p::AbstractVector = Array{eltype(X)}(undef, 0),
p::Union{AbstractVector, MTKParameters} = Array{eltype(X)}(undef, 0),
probtype = nothing,
kwargs...) where {F <: Union{AbstractMatrix, Function}}
return DataDrivenProblem(probtype, X, t, DX, Y, U, p; kwargs...)
_p = isa(p, MTKParameters) ? p.tunable : p
return DataDrivenProblem(probtype, X, t, DX, Y, U, _p; kwargs...)
end

function Base.summary(io::IO, x::DataDrivenProblem{N, C, P}) where {N, C, P}
@@ -313,7 +314,7 @@ function ModelingToolkit.get_dvs(p::ABSTRACT_DIRECT_PROB, i = :, j = :)
end

function ModelingToolkit.get_dvs(p::ABSTRACT_DISCRETE_PROB, i = :, j = :)
ModelingToolkit.states(p, i, j)
states(p, i, j)
end

function ModelingToolkit.observed(p::AbstractDataDrivenProblem, i = :, j = :)
@@ -327,7 +328,7 @@ function ModelingToolkit.controls(p::AbstractDataDrivenProblem, i = :, j = :)
end

function Base.getindex(p::AbstractDataDrivenProblem, i = :, j = :)
return (ModelingToolkit.states(p, i, j),
return (states(p, i, j),
ModelingToolkit.parameters(p),
ModelingToolkit.independent_variable(p, j),
ModelingToolkit.controls(p, i, j))
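A minimal sketch of the idea behind the `MTKParameters` handling introduced above (the `tunable` field is assumed from ModelingToolkit v9; the helper name here is illustrative, not part of the package):

```julia
# DataDrivenProblem stores parameters as a plain vector. Problems built
# from a ModelingToolkit-based solution may instead carry an MTKParameters
# struct, so only its tunable (numeric) portion is forwarded:
normalize_parameters(p::AbstractVector) = p
normalize_parameters(p) = hasproperty(p, :tunable) ? p.tunable : p
```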
6 changes: 3 additions & 3 deletions src/utils/collocation.jl
@@ -142,7 +142,7 @@ $(SIGNATURES)

Unified interface for collocation techniques. The input can either be
a `CollocationKernel` (see list below) or a wrapped `InterpolationMethod` from
[DataInterpolations.jl](https://github.com/PumasAI/DataInterpolations.jl).
[DataInterpolations.jl](https://github.com/SciML/DataInterpolations.jl).

Computes a non-parametrically smoothed estimate of `u'` and `u`
given the `data`, where each column is a snapshot of the timeseries at
@@ -156,7 +157,7 @@ u′,u,t = collocate_data(data,tpoints,interp)
```

# Collocation Kernels
See [this paper](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2631937/) for more information.
See [this paper](https://pmc.ncbi.nlm.nih.gov/articles/PMC2631937/) for more information.
+ EpanechnikovKernel
+ UniformKernel
+ TriangularKernel
@@ -170,7 +171,7 @@ See [this paper](https://pmc.ncbi.nlm.nih.gov/articles/PMC2631937/) for more
+ SilvermanKernel

# Interpolation Methods
See [DataInterpolations.jl](https://github.com/PumasAI/DataInterpolations.jl) for more information.
See [DataInterpolations.jl](https://github.com/SciML/DataInterpolations.jl) for more information.
+ ConstantInterpolation
+ LinearInterpolation
+ QuadraticInterpolation
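As a quick usage sketch of the unified collocation interface documented above (the data is synthetic, and the kernel choice and two-value return of the kernel variant are assumptions based on the docstring):

```julia
using DataDrivenDiffEq

t = collect(0.0:0.01:1.0)
# 2 × N snapshot matrix, one column per time point
data = permutedims(hcat(sin.(2π .* t), cos.(2π .* t)))

# Non-parametric estimates of u' and u via a collocation kernel
du, u = collocate_data(data, t, EpanechnikovKernel())
```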
28 changes: 28 additions & 0 deletions test/problem/problem.jl
@@ -162,3 +162,31 @@ end
prob2 = (X = X, t = t, Y = Y))
@test_throws ArgumentError ContinuousDataset(wrong_data)
end

@testset "DataDrivenProblem from ODEProblem solution" begin
using OrdinaryDiffEqTsit5
using ModelingToolkit: t_nounits as time, D_nounits as D

@mtkmodel Autoregulation begin
@parameters begin
α = 1.0
β = 1.3
γ = 2.0
δ = 0.5
end
@variables begin
(x(time))[1:2] = [20.0; 12.0]
end
@equations begin
D(x[1]) ~ α / (1 + x[2]) - β * x[1]
D(x[2]) ~ γ / (1 + x[1]) - δ * x[2]
end
end

@mtkbuild sys = Autoregulation()
tspan = (0.0, 5.0)
de_problem = ODEProblem{true, SciMLBase.NoSpecialize}(sys, [], tspan, [])
de_solution = solve(de_problem, Tsit5(), saveat = 0.005)
prob = DataDrivenProblem(de_solution)
@test is_valid(prob)
end