45 changes: 45 additions & 0 deletions .github/workflows/CI_ecosystem.yml
@@ -0,0 +1,45 @@
name: CI_ecosystem
on:
  #pull_request:
  #  branches:
  #    - main
  #push:
  #  branches:
  #    - main
  workflow_dispatch:
jobs:
  test:
    name: Julia ${{ matrix.version }} - ${{ matrix.os }} - ${{ matrix.arch }} - ${{ github.event_name }}
    runs-on: ${{ matrix.os }}
    env:
      JULIA_NUM_THREADS: 8
      JULIA_EXTENDED_TESTS: true
    strategy:
      fail-fast: false
      matrix:
        version:
          - '1.7.3'
        os:
          - ubuntu-latest
          - macos-latest
          - windows-latest
        arch:
          - x64
    steps:
      - uses: actions/checkout@v2
      - uses: julia-actions/setup-julia@v1
        with:
          version: ${{ matrix.version }}
          arch: ${{ matrix.arch }}
      - uses: actions/cache@v1
        env:
          cache-name: cache-artifacts
        with:
          path: ~/.julia/artifacts
          key: ${{ runner.os }}-test_extended-${{ env.cache-name }}-${{ hashFiles('**/Project.toml') }}
      - uses: julia-actions/julia-buildpkg@v1
      - uses: julia-actions/julia-runtest@v1
      - uses: julia-actions/julia-processcoverage@v1
      - uses: codecov/codecov-action@v2
        with:
          file: lcov.info
2 changes: 1 addition & 1 deletion .github/workflows/CI_extended.yml
@@ -1,4 +1,4 @@
-name: CI
+name: CI_extended
on:
  pull_request:
    branches:
1 change: 1 addition & 0 deletions Project.toml
@@ -5,6 +5,7 @@ version = "0.1.0"

[deps]
ChainRules = "082447d4-558c-5d27-93f4-14fc19e9eca2"
+ChainRulesCore = "d360d2e6-b24c-11e9-a2a3-2a2ae2dbcce4"
DataFrames = "a93c6f00-e57d-5684-b7b6-d8193f3e46c0"
DelimitedFiles = "8bb1440f-4735-579b-a4ab-409b98df4dab"
Distributions = "31c24e10-a181-5473-b8eb-7969acd0382f"
6 changes: 4 additions & 2 deletions docs/make.jl
@@ -11,11 +11,13 @@ makedocs(
"tutorials/specification/specification.md",
"tutorials/specification/graph_interface.md",
"tutorials/specification/ram_matrices.md",
"tutorials/specification/parameter_table.md"],
"tutorials/specification/parameter_table.md"
],
"Model Construction" => [
"tutorials/construction/construction.md",
"tutorials/construction/outer_constructor.md",
"tutorials/construction/build_by_parts.md"],
"tutorials/construction/build_by_parts.md"
],
"Optimization Backends" => [
"tutorials/backends/optim.md",
"tutorials/backends/nlopt.md"
4 changes: 3 additions & 1 deletion docs/src/developer/loss.md
@@ -267,4 +267,6 @@ model_ml = SemFiniteDiff(
)

model_fit = sem_fit(model_ml)
-```
+```
+
+If you want to differentiate your own loss functions via automatic differentiation, check out the [AutoDiffSEM](https://github.com/StructuralEquationModels/AutoDiffSEM) package (spoiler alert: it's really easy).
9 changes: 3 additions & 6 deletions docs/src/developer/sem.md
@@ -1,20 +1,17 @@
# Custom model types

-The abstract supertype for all models is `AbstractSem`, which has two subtypes, `AbstractSemSingle{O, I, L, D}` and `AbstractSemCollection`. Currently, there are three subtypes of `AbstractSemSingle`: `Sem`, `SemFiniteDiff` and `SemForwardDiff`. All subtypes of `AbstractSemSingle` should have at least observed, imply, loss and optimizer fields, and share their types (`{O, I, L, D}`) with the parametric abstract supertype. For example, the `SemFiniteDiff` type is implemented as
+The abstract supertype for all models is `AbstractSem`, which has two subtypes, `AbstractSemSingle{O, I, L, D}` and `AbstractSemCollection`. Currently, there are two subtypes of `AbstractSemSingle`: `Sem` and `SemFiniteDiff`. All subtypes of `AbstractSemSingle` should have at least observed, imply, loss and optimizer fields, and share their types (`{O, I, L, D}`) with the parametric abstract supertype. For example, the `SemFiniteDiff` type is implemented as

```julia
struct SemFiniteDiff{
    O <: SemObserved,
    I <: SemImply,
    L <: SemLoss,
-    D <: SemOptimizer,
-    G} <: AbstractSemSingle{O, I, L, D}
+    D <: SemOptimizer} <: AbstractSemSingle{O, I, L, D}
    observed::O
    imply::I
    loss::L
-    optimizer::D
-    has_gradient::G
-end
+    optimizer::D
+end
```

Additionally, we need to define a method to compute at least the objective value; if you want to use gradient-based optimizers (which you most probably will), we also need to define a method to compute the gradient. The respective fallback methods for all `AbstractSemSingle` models are defined in the package source (their definitions are collapsed in this diff view).
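As a rough, hypothetical sketch of what such a pair of methods can look like (every name below is invented for illustration; only `FiniteDiff.finite_difference_gradient!` is a real call):

```julia
using FiniteDiff

# Invented stand-in for an AbstractSemSingle subtype with the four required fields.
struct MySem{O, I, L, D}
    observed::O
    imply::I
    loss::L
    optimizer::D
end

# Objective fallback: sum the values of all loss functions (here: plain callables).
my_objective(model::MySem, parameters) = sum(f(parameters) for f in model.loss)

# Gradient fallback via finite differences, for gradient-based optimizers.
function my_gradient!(gradient, model::MySem, parameters)
    FiniteDiff.finite_difference_gradient!(gradient, p -> my_objective(model, p), parameters)
    return gradient
end

# Toy usage with a single quadratic loss:
model = MySem(nothing, nothing, (p -> sum(abs2, p),), nothing)
my_gradient!(zeros(2), model, [1.0, 2.0])  # ≈ [2.0, 4.0]
```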
1 change: 0 additions & 1 deletion docs/src/internals/types.md
@@ -6,7 +6,6 @@ The type hierarchy is implemented in `"src/types.jl"`.
- `AbstractSemSingle{O, I, L, D} <: AbstractSem` is an abstract parametric type that is a supertype of all single models
- `Sem`: models that do not need automatic differentiation or finite difference approximation
- `SemFiniteDiff`: models whose gradients and/or hessians should be computed via finite difference approximation
-- `SemForwardDiff`: models whose gradients and/or hessians should be computed via forward mode automatic differentiation
- `AbstractSemCollection <: AbstractSem` is an abstract supertype of all models that contain multiple `AbstractSem` submodels

Every `AbstractSemSingle` has to have `SemObserved`, `SemImply`, `SemLoss` and `SemOptimizer` fields (and can have additional fields).
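As a quick sanity check of this hierarchy, the following subtype relations should hold in the REPL (a sketch assuming the package is loaded; the `SemEnsemble` relation is an assumption based on the export list in this PR):

```julia
using StructuralEquationModels

Sem <: AbstractSemSingle              # true
SemFiniteDiff <: AbstractSemSingle    # true
AbstractSemSingle <: AbstractSem      # true
SemEnsemble <: AbstractSemCollection  # assumed: ensembles collect multiple submodels
```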
2 changes: 1 addition & 1 deletion docs/src/tutorials/collection/collection.md
@@ -13,7 +13,7 @@ model_1 = Sem(...)

model_2 = SemFiniteDiff(...)

-model_3 = SemForwardDiff(...)
+model_3 = Sem(...)

model_ensemble = SemEnsemble(model_1, model_2, model_3; optimizer = ...)
```
6 changes: 0 additions & 6 deletions docs/src/tutorials/construction/build_by_parts.md
@@ -64,10 +64,4 @@ optimizer = SemOptimizerOptim()
# model ------------------------------------------------------------------------------------

model_ml = Sem(observed, imply_ram, loss_ml, optimizer)
```
-
-Different models may need additional arguments (just check the help of the specific model types):
-
-```@example build
-model_ml_fd = SemFiniteDiff(observed, imply_ram, loss_ml, optimizer, Val(false))
-```
29 changes: 5 additions & 24 deletions docs/src/tutorials/construction/outer_constructor.md
@@ -115,37 +115,18 @@ Extended help is available with `??`

## Optimize loss functions without analytic gradient

-For loss functions without analytic gradients, it is possible to use finite difference approximation or forward mode automatic differentiation.
+For loss functions without analytic gradients, it is possible to use finite difference approximation or automatic differentiation.
All loss functions provided in the package do have analytic gradients (and some even hessians or approximations thereof), so there is no need to use this feature if you are only working with them.
However, if you implement your own loss function, you do not have to provide analytic gradients.
-In that case, you may construct your model just as before, but swap the `Sem` constructor for either `SemFiniteDiff` or `SemForwardDiff`. For example
+This page is about finite difference approximation. For information about how to use automatic differentiation, see the documentation of the [AutoDiffSEM](https://github.com/StructuralEquationModels/AutoDiffSEM) package.

-```julia
-model = SemFiniteDiff(
-    specification = partable,
-    data = data
-)
-```
-
-constructs a model that will use finite difference approximation if you estimate the parameters via `sem_fit(model)`.
-Both `SemFiniteDiff` and `SemForwardDiff` have an additional keyword argument, `has_gradient = ...` that can be set to `true` to indicate that the model has analytic gradients, and only the hessian should be computed via finite difference approximation / automatic differentiation.
-For example
+To use finite difference approximation, you may construct your model just as before, but swap the `Sem` constructor for `SemFiniteDiff`. For example

```julia
-using Optim, LineSearches
-
model = SemFiniteDiff(
    specification = partable,
-    data = data,
-    has_gradient = true,
-    algorithm = Newton()
+    data = data
)
```

-will construct a model that, when fitted, will use [Newton's Method](https://julianlsolvers.github.io/Optim.jl/stable/#algo/newton/) from the `Optim.jl` package with gradients computed analytically and hessians computed via finite difference approximation.
-
-!!! note "Using automatic differentiation"
-    You can construct a `SemForwardDiff` to use forward-mode automatic differentiation for the gradients.
-    However, at the moment, this does not work with the imply types in our package
-    (e.g. it only works with models that use `ImplyEmpty`).
+constructs a model that will use finite difference approximation if you estimate the parameters via `sem_fit(model)`.
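Fitting then works like for any other model; a brief usage sketch (`sem_fit` is the same entry point shown in the developer docs above):

```julia
# Parameters are estimated as usual; gradients come from finite differences.
model_fit = sem_fit(model)
```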
6 changes: 3 additions & 3 deletions src/StructuralEquationModels.jl
@@ -4,7 +4,7 @@ using LinearAlgebra, Optim,
    NLSolversBase, Statistics, SparseArrays, Symbolics,
    NLopt, FiniteDiff, ForwardDiff, PrettyTables,
    Distributions, StenoGraphs, LazyArtifacts, DelimitedFiles,
-    DataFrames
+    DataFrames, Zygote, ChainRulesCore

import DataFrames: DataFrame
export *, ==, @StenoGraph, AbstractEdge, AbstractNode, DirectedEdge, Edge, EdgeModifier,
@@ -82,10 +82,10 @@ include("frontend/fit/standard_errors/bootstrap.jl")


export AbstractSem,
-    AbstractSemSingle, AbstractSemCollection, Sem, SemFiniteDiff, SemForwardDiff,
+    AbstractSemSingle, AbstractSemCollection, Sem, SemFiniteDiff,
    SemEnsemble,
    SemImply,
-    RAMSymbolic, RAM, ImplyEmpty, imply,
+    RAMSymbolic, RAMSymbolicZ, RAM, ImplyEmpty, imply,
    start_val,
    start_fabin3, start_simple, start_parameter_table,
    SemLoss,
10 changes: 5 additions & 5 deletions src/additional_functions/start_val/start_fabin3.jl
@@ -8,7 +8,7 @@ function start_fabin3 end

# splice model and loss functions
function start_fabin3(
-    model::Union{Sem, SemForwardDiff, SemFiniteDiff};
+    model::AbstractSemSingle;
    kwargs...)
    return start_fabin3(
        model.observed,
@@ -20,20 +20,20 @@ end

function start_fabin3(
    observed,
-    imply::Union{RAM, RAMSymbolic},
+    imply,
    optimizer,
    args...;
    kwargs...)
    return start_fabin3(
-        imply.ram_matrices,
+        ram_matrices(imply),
        obs_cov(observed),
        obs_mean(observed))
end

# SemObservedMissing
function start_fabin3(
    observed::SemObservedMissing,
-    imply::Union{RAM, RAMSymbolic},
+    imply,
    optimizer,
    args...;
    kwargs...)
@@ -43,7 +43,7 @@ function start_fabin3(
    end

    return start_fabin3(
-        imply.ram_matrices,
+        ram_matrices(imply),
        observed.em_model.Σ,
        observed.em_model.μ)
end
6 changes: 3 additions & 3 deletions src/additional_functions/start_val/start_partable.jl
@@ -6,7 +6,7 @@ Return a vector of starting values taken from `parameter_table`.
function start_parameter_table end

# splice model and loss functions
-function start_parameter_table(model::Union{Sem, SemForwardDiff, SemFiniteDiff}; kwargs...)
+function start_parameter_table(model::AbstractSemSingle; kwargs...)
    return start_parameter_table(
        model.observed,
        model.imply,
@@ -16,9 +16,9 @@ function start_parameter_table(model::Union{Sem, SemForwardDiff, SemFiniteDiff};
end

# RAM(Symbolic)
-function start_parameter_table(observed, imply::Union{RAM, RAMSymbolic}, optimizer, args...; kwargs...)
+function start_parameter_table(observed, imply, optimizer, args...; kwargs...)
    return start_parameter_table(
-        imply.ram_matrices;
+        ram_matrices(imply);
        kwargs...)
end

4 changes: 2 additions & 2 deletions src/additional_functions/start_val/start_simple.jl
@@ -16,7 +16,7 @@ Return a vector of simple starting values.
function start_simple end

# Single Models ----------------------------------------------------------------------------
-function start_simple(model::Union{Sem, SemForwardDiff, SemFiniteDiff}; kwargs...)
+function start_simple(model::AbstractSemSingle; kwargs...)
    return start_simple(
        model.observed,
        model.imply,
@@ -25,7 +25,7 @@ function start_simple(model::Union{Sem, SemForwardDiff, SemFiniteDiff}; kwargs..
    kwargs...)
end

-function start_simple(observed, imply::Union{RAM, RAMSymbolic}, optimizer, args...; kwargs...)
+function start_simple(observed, imply, optimizer, args...; kwargs...)
    return start_simple(imply.ram_matrices; kwargs...)
end

4 changes: 2 additions & 2 deletions src/additional_functions/start_val/start_val.jl
@@ -9,7 +9,7 @@ function start_val end
# Single Models ----------------------------------------------------------------------------

# splice model and loss functions
-start_val(model::Union{Sem, SemFiniteDiff, SemForwardDiff}; kwargs...) =
+start_val(model::AbstractSemSingle; kwargs...) =
    start_val(
        model,
        model.observed,
@@ -22,7 +22,7 @@ start_val(model::Union{Sem, SemFiniteDiff, SemForwardDiff}; kwargs...) =
start_val(
    model,
    observed,
-    imply::Union{RAM, RAMSymbolic},
+    imply,
    optimizer,
    args...;
    kwargs...) =
34 changes: 1 addition & 33 deletions src/frontend/specification/Sem.jl
@@ -25,7 +25,6 @@ function SemFiniteDiff(;
    imply::I = RAM,
    loss::L = SemML,
    optimizer::D = SemOptimizerOptim,
-    has_gradient = false,
    kwargs...) where {O, I, L, D}

    kwargs = Dict{Symbol, Any}(kwargs...)
@@ -34,30 +34,11 @@

    observed, imply, loss, optimizer = get_fields!(kwargs, observed, imply, loss, optimizer)

-    sem = SemFiniteDiff(observed, imply, loss, optimizer, Val(has_gradient))
+    sem = SemFiniteDiff(observed, imply, loss, optimizer)

    return sem
end

-function SemForwardDiff(;
-    observed::O = SemObservedData,
-    imply::I = RAM,
-    loss::L = SemML,
-    optimizer::D = SemOptimizerOptim,
-    has_gradient = false,
-    kwargs...) where {O, I, L, D}
-
-    kwargs = Dict{Symbol, Any}(kwargs...)
-
-    set_field_type_kwargs!(kwargs, observed, imply, loss, optimizer, O, I, D)
-
-    observed, imply, loss, optimizer = get_fields!(kwargs, observed, imply, loss, optimizer)
-
-    sem = SemForwardDiff(observed, imply, loss, optimizer, Val(has_gradient))
-
-    return sem
-end

############################################################################################
# functions
############################################################################################
@@ -162,18 +162,6 @@ function Base.show(io::IO, sem::SemFiniteDiff{O, I, L, D}) where {O, I, L, D}
    print(io, " optimizer: $(nameof(D)) \n")
end

-function Base.show(io::IO, sem::SemForwardDiff{O, I, L, D}) where {O, I, L, D}
-    lossfuntypes = @. string(nameof(typeof(sem.loss.functions)))
-    lossfuntypes = " ".*lossfuntypes.*("\n")
-    print(io, "Structural Equation Model : Forward Mode Autodiff\n")
-    print(io, "- Loss Functions \n")
-    print(io, lossfuntypes...)
-    print(io, "- Fields \n")
-    print(io, " observed: $(nameof(O)) \n")
-    print(io, " imply: $(nameof(I)) \n")
-    print(io, " optimizer: $(nameof(D)) \n")
-end

function Base.show(io::IO, loss::SemLoss)
    lossfuntypes = @. string(nameof(typeof(loss.functions)))
    lossfuntypes = " ".*lossfuntypes.*("\n")
2 changes: 2 additions & 0 deletions src/imply/RAM/generic.jl
@@ -315,6 +315,8 @@ I_A⁻¹(imply::RAM) = imply.I_A⁻¹ # only for gradient available!

has_meanstructure(imply::RAM) = imply.has_meanstructure

+ram_matrices(imply::RAM) = imply.ram_matrices

############################################################################################
### additional functions
############################################################################################
2 changes: 2 additions & 0 deletions src/imply/RAM/symbolic.jl
@@ -262,6 +262,8 @@ end

has_meanstructure(imply::RAMSymbolic) = imply.has_meanstructure

+ram_matrices(imply::RAMSymbolic) = imply.ram_matrices

############################################################################################
### additional functions
############################################################################################
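A closing note on the last two hunks: the start-value functions above now call `ram_matrices(imply)` instead of reaching into the `imply.ram_matrices` field, so a custom imply type only has to define one accessor method to participate. A minimal sketch (the type and field below are hypothetical):

```julia
# Hypothetical imply type storing a RAM specification under another field name.
struct MyImply{R}
    spec::R  # assumed to hold the RAM matrices
end

# One method definition is all the start-value machinery above requires:
ram_matrices(imply::MyImply) = imply.spec
```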