**OptimizationODE.jl** provides ODE-based optimization methods as a solver plugin for [SciML's Optimization.jl](https://github.com/SciML/Optimization.jl). It wraps various ODE solvers to perform gradient-based optimization using continuous-time dynamics.
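The core idea behind these methods is gradient flow: a local minimizer of `f` is a stable equilibrium of the ODE `x'(t) = -∇f(x(t))`, and integrating that ODE with the explicit Euler method and step `dt` is exactly gradient descent with learning rate `dt`. A hand-rolled sketch of the idea in plain Julia (illustrative only, not this package's API):

```julia
# For f(x) = sum(abs2, x) the gradient is ∇f(x) = 2x, so the gradient flow
# is dx/dt = -2x. One explicit Euler step, x .-= dt .* 2 .* x, is exactly a
# gradient-descent update with learning rate dt.
function gradient_flow_euler(x0, dt, nsteps)
    x = copy(x0)
    for _ in 1:nsteps
        x .-= dt .* (2 .* x)   # Euler step on dx/dt = -∇f(x)
    end
    return x
end

x = gradient_flow_euler([2.0, -3.0], 0.01, 2000)
# x decays geometrically (factor 1 - 2dt per step) toward the minimizer at 0
```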

## Installation

```julia
using Pkg
Pkg.add(url="https://github.com/SciML/OptimizationODE.jl")
```

## Usage

```julia
using OptimizationODE, Optimization, ADTypes, SciMLBase

function f(x, p)
    return sum(abs2, x)
end

function g!(g, x, p)
    @. g = 2 * x
end

x0 = [2.0, -3.0]
p = []

f_manual = OptimizationFunction(f, SciMLBase.NoAD(); grad = g!)
prob_manual = OptimizationProblem(f_manual, x0)

opt = ODEGradientDescent(dt=0.01)
sol = solve(prob_manual, opt; maxiters=50_000)

@show sol.u
@show sol.objective
```
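The hand-written gradient above can be replaced by an automatic differentiation backend, and the optimizers are interchangeable in `solve`. A hedged sketch (it assumes `ForwardDiff` is installed and that the `ADTypes.AutoForwardDiff()` backend name applies, per the ADTypes.jl conventions):

```julia
# Hedged sketch: same problem, but with the gradient supplied by
# ForwardDiff-based automatic differentiation instead of a manual g!.
using OptimizationODE, Optimization, ADTypes, ForwardDiff

f(x, p) = sum(abs2, x)
x0 = [2.0, -3.0]

f_ad = OptimizationFunction(f, ADTypes.AutoForwardDiff())
prob = OptimizationProblem(f_ad, x0)
sol = solve(prob, RKAccelerated(); maxiters=1_000)
```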

## Available Optimizers

* `ODEGradientDescent(dt=...)` — uses the explicit Euler method.
* `RKChebyshevDescent()` — uses the ROCK2 method.
* `RKAccelerated()` — uses the Tsit5 Runge-Kutta method.
* `HighOrderDescent()` — uses the Vern7 high-order Runge-Kutta method.
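The choice among these integrators is a stability/efficiency trade-off: explicit Euler is only stable when `dt` stays below `2/L` (with `L` the largest curvature of the objective), while stabilized and higher-order methods tolerate larger effective steps. The restriction is easy to see on an ill-conditioned quadratic, in plain Julia (illustrative only, not the package API):

```julia
# Gradient of f(x) = 0.5 * (x[1]^2 + 100 * x[2]^2); curvatures are 1 and 100,
# so explicit Euler on dx/dt = -∇f(x) is stable only for dt < 2/100 = 0.02.
grad_ill(x) = [1.0, 100.0] .* x

function euler_run(dt, nsteps)
    x = [1.0, 1.0]
    for _ in 1:nsteps
        x .-= dt .* grad_ill(x)   # per-step factors: 1 - dt and 1 - 100dt
    end
    return x
end

stable   = euler_run(0.015, 1000)   # |1 - 100*0.015| = 0.5 < 1: converges
unstable = euler_run(0.025, 1000)   # |1 - 100*0.025| = 1.5 > 1: blows up
```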

## Interface Details

All optimizers require gradient information, supplied either through an automatic differentiation backend or as a manually written gradient function passed via the `grad` keyword of `OptimizationFunction`.

### Keyword Arguments

* `dt` — time step size (only for `ODEGradientDescent`).
* `maxiters` — maximum number of ODE steps.
* `callback` — function to observe progress.
* `progress=true` — enables live progress display.
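A minimal callback sketch, assuming the standard Optimization.jl callback signature `(state, objective) -> halt::Bool`, where returning `true` stops the run early:

```julia
# Hedged sketch: an Optimization.jl-style callback. It receives the current
# optimizer state and objective value, logs the objective, and returns
# `true` to halt early once the objective is small enough.
function logging_callback(state, obj)
    @info "current objective" obj
    return obj < 1e-10
end

# Passed to solve as, e.g.:
# solve(prob, ODEGradientDescent(dt=0.01); maxiters=10_000, callback=logging_callback)
```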

## Development

Please refer to the `runtests.jl` file for a complete set of tests that demonstrate how each optimizer is used.