
Conversation

@albertomercurio

Fixes #1695.

@albertomercurio
Author

Apparently all the tests are failing. I can try to figure out why, but the output is quite long to dig through.

To run the tests locally, is it enough to simply run Pkg.test()?
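
For reference, the usual way to run the full suite from a local checkout is something like the following (a minimal sketch, assuming no extra test harness or environment variables are needed):

using Pkg
Pkg.activate(".")   # from the root of the Reactant.jl checkout
Pkg.test()          # runs test/runtests.jl in a dedicated test environment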

@albertomercurio
Author

Making AnyTracedRArray a DenseArray subtype seems to fix a few examples, but it breaks many others. I don't know how to proceed.

@giordano
Member

> I don't know how to proceed.

Honestly I don't think anyone does because no one has tried doing this. The most productive way is usually to look at one of the errors and start crafting a minimal reproducer, which hopefully would make it clearer what in the dispatch (presumably) is going haywire.

@albertomercurio
Author

OK, just to give some information:

  • There are 20 errors when making only RArray a subtype of DenseArray
  • These become 70 or so when also making AnyTracedRArray a subtype of DenseArray

First, does it make sense to have AnyTracedRArray <: DenseArray? Without it, the following code fails

using Reactant
sum_xxᵀ(x) = sum(x .* x')
x = rand(4,4)
x_ca = Reactant.to_rarray(x)
@jit(sum_xxᵀ(x_ca))

with error

ERROR: Scalar indexing is disallowed.
Invocation of getindex(::TracedRArray, ::Vararg{Int, N}) resulted in scalar indexing of a GPU array.
This is typically caused by calling an iterating implementation of a method.
Such implementations *do not* execute on the GPU, but very slowly on the CPU,
and therefore should be avoided.

If you want to allow scalar iteration, use `allowscalar` or `@allowscalar`
to enable scalar iteration globally or for the operations in question.
Stacktrace:
  [1] error(s::String)
    @ Base ./error.jl:35
  [2] errorscalar(op::String)
    @ GPUArraysCore ~/.julia/packages/GPUArraysCore/aNaXo/src/GPUArraysCore.jl:151
  [3] _assertscalar(op::String, behavior::GPUArraysCore.ScalarIndexing)
    @ GPUArraysCore ~/.julia/packages/GPUArraysCore/aNaXo/src/GPUArraysCore.jl:124
  [4] assertscalar(op::String)
    @ GPUArraysCore ~/.julia/packages/GPUArraysCore/aNaXo/src/GPUArraysCore.jl:112
  [5] getindex(::Reactant.TracedRArray{Float64, 2}, ::Int64, ::Int64)
    @ Reactant.TracedIndexing ~/.julia/dev/Reactant/src/Indexing.jl:40
  [6] getindex
    @ ~/.julia/dev/Reactant/src/Indexing.jl:48 [inlined]
  [7] _getindex
    @ ./abstractarray.jl:1341 [inlined]
  [8] getindex
    @ ./abstractarray.jl:1312 [inlined]
  [9] _permutedims!(P::PermutedDimsArray{…}, src::Reactant.TracedRArray{…}, R1::CartesianIndices{…}, R2::CartesianIndices{…}, R3::CartesianIndices{…}, ds::Int64, dp::Int64)
    @ Base.PermutedDimsArrays ./permuteddimsarray.jl:322
 [10] _copy!(P::PermutedDimsArray{Reactant.TracedRNumber{Float64}, 2, (2, 1), (2, 1), Reactant.TracedRArray{Float64, 2}}, src::Reactant.TracedRArray{Float64, 2})
    @ Base.PermutedDimsArrays ./permuteddimsarray.jl:311
 [11] permutedims!(dest::Reactant.TracedRArray{Float64, 2}, src::Reactant.TracedRArray{Float64, 2}, perm::Tuple{Int64, Int64})
    @ Base.PermutedDimsArrays ./permuteddimsarray.jl:287
 [12] permutedims(B::Reactant.TracedRArray{Float64, 2}, perm::Tuple{Int64, Int64})
    @ Base ./multidimensional.jl:1634
 [13] materialize_traced_array
    @ ~/.julia/dev/Reactant/src/stdlibs/LinearAlgebra.jl:48 [inlined]
 [14] materialize_traced_array(x::LinearAlgebra.Adjoint{Reactant.TracedRNumber{Float64}, Reactant.TracedRArray{Float64, 2}})
    @ Reactant.TracedLinearAlgebra ~/.julia/dev/Reactant/src/stdlibs/LinearAlgebra.jl:54
 [15] promote_to
    @ ~/.julia/dev/Reactant/src/TracedPromotion.jl:22 [inlined]
 [16] (::Nothing)(none::typeof(Reactant.promote_to), none::Type{…}, none::LinearAlgebra.Adjoint{…})
    @ Reactant ./<missing>:0
 [17] promote_to
    @ ~/.julia/dev/Reactant/src/TracedPromotion.jl:22 [inlined]
 [18] call_with_reactant(::typeof(Reactant.promote_to), ::Type{Reactant.TracedRArray{…}}, ::LinearAlgebra.Adjoint{Reactant.TracedRNumber{…}, Reactant.TracedRArray{…}})
    @ Reactant ~/.julia/dev/Reactant/src/utils.jl:0
 [19] promote_to
    @ ~/.julia/dev/Reactant/src/TracedPromotion.jl:8 [inlined]
 [20] broadcast_to_size
    @ ~/.julia/dev/Reactant/src/TracedPromotion.jl:94 [inlined]
 [21] broadcast_to_size
    @ ~/.julia/dev/Reactant/src/TracedPromotion.jl:112 [inlined]
 [22] (::Nothing)(none::typeof(Reactant.broadcast_to_size), none::Base.Broadcast.Extruded{LinearAlgebra.Adjoint{…}, Tuple{…}, Tuple{…}}, none::Tuple{Int64, Int64})
    @ Reactant ./<missing>:0
 [23] getproperty
    @ ./Base.jl:49 [inlined]
 [24] broadcast_to_size
    @ ~/.julia/dev/Reactant/src/TracedPromotion.jl:111 [inlined]
 [25] call_with_reactant(::typeof(Reactant.broadcast_to_size), ::Base.Broadcast.Extruded{LinearAlgebra.Adjoint{…}, Tuple{…}, Tuple{…}}, ::Tuple{Int64, Int64})
    @ Reactant ~/.julia/dev/Reactant/src/utils.jl:0
 [26] #15
    @ ./none:0 [inlined]
 [27] iterate
    @ ./generator.jl:48 [inlined]
 [28] _copyto!
    @ ~/.julia/dev/Reactant/src/TracedRArray.jl:502 [inlined]
 [29] (::Nothing)(none::typeof(Reactant.TracedRArrayOverrides._copyto!), none::Reactant.TracedRArray{…}, none::Base.Broadcast.Broadcasted{…})
    @ Reactant ./<missing>:0
 [30] getproperty
    @ ./Base.jl:49 [inlined]
 [31] size
    @ ~/.julia/dev/Reactant/src/TracedRArray.jl:248 [inlined]
 [32] axes
    @ ./abstractarray.jl:98 [inlined]
 [33] _copyto!
    @ ~/.julia/dev/Reactant/src/TracedRArray.jl:495 [inlined]
 [34] call_with_reactant(::typeof(Reactant.TracedRArrayOverrides._copyto!), ::Reactant.TracedRArray{…}, ::Base.Broadcast.Broadcasted{…})
    @ Reactant ~/.julia/dev/Reactant/src/utils.jl:0
 [35] copyto!
    @ ~/.julia/dev/Reactant/src/TracedRArray.jl:456 [inlined]
 [36] (::Nothing)(none::typeof(copyto!), none::Reactant.TracedRArray{Float64, 2}, none::Base.Broadcast.Broadcasted{Nothing, Tuple{…}, typeof(*), Tuple{…}})
    @ Reactant ./<missing>:0
 [37] copyto!
    @ ~/.julia/dev/Reactant/src/TracedRArray.jl:456 [inlined]
 [38] call_with_reactant(::typeof(copyto!), ::Reactant.TracedRArray{Float64, 2}, ::Base.Broadcast.Broadcasted{Nothing, Tuple{…}, typeof(*), Tuple{…}})
    @ Reactant ~/.julia/dev/Reactant/src/utils.jl:0
 [39] copyto!
    @ ./broadcast.jl:925 [inlined]
 [40] copy
    @ ~/.julia/dev/Reactant/src/TracedRArray.jl:447 [inlined]
 [41] (::Nothing)(none::typeof(copy), none::Base.Broadcast.Broadcasted{Reactant.TracedRArrayOverrides.AbstractReactantArrayStyle{…}, Tuple{…}, typeof(*), Tuple{…}})
    @ Reactant ./<missing>:0
 [42] call_with_reactant(::typeof(copy), ::Base.Broadcast.Broadcasted{Reactant.TracedRArrayOverrides.AbstractReactantArrayStyle{…}, Tuple{…}, typeof(*), Tuple{…}})
    @ Reactant ~/.julia/dev/Reactant/src/utils.jl:519
 [43] materialize
    @ ./broadcast.jl:872 [inlined]
 [44] sum_xxᵀ
    @ ./REPL[2]:1 [inlined]
 [45] (::Nothing)(none::typeof(sum_xxᵀ), none::Reactant.TracedRArray{Float64, 2})
    @ Reactant ./<missing>:0
 [46] Adjoint
    @ ~/.julia/juliaup/julia-1.11.7+0.x64.linux.gnu/share/julia/stdlib/v1.11/LinearAlgebra/src/adjtrans.jl:33 [inlined]
 [47] Adjoint
    @ ~/.julia/juliaup/julia-1.11.7+0.x64.linux.gnu/share/julia/stdlib/v1.11/LinearAlgebra/src/adjtrans.jl:64 [inlined]
 [48] adjoint
    @ ~/.julia/juliaup/julia-1.11.7+0.x64.linux.gnu/share/julia/stdlib/v1.11/LinearAlgebra/src/adjtrans.jl:189 [inlined]
 [49] sum_xxᵀ
    @ ./REPL[2]:1 [inlined]
 [50] call_with_reactant(::typeof(sum_xxᵀ), ::Reactant.TracedRArray{Float64, 2})
    @ Reactant ~/.julia/dev/Reactant/src/utils.jl:0
 [51] make_mlir_fn(f::typeof(sum_xxᵀ), args::Tuple{…}, kwargs::@NamedTuple{}, name::String, concretein::Bool; toscalar::Bool, return_dialect::Symbol, args_in_result::Symbol, construct_function_without_args::Bool, do_transpose::Bool, input_shardings::Nothing, output_shardings::Nothing, runtime::Val{…}, verify_arg_names::Nothing, argprefix::Symbol, resprefix::Symbol, resargprefix::Symbol, num_replicas::Int64, optimize_then_pad::Bool)
    @ Reactant.TracedUtils ~/.julia/dev/Reactant/src/TracedUtils.jl:348
 [52] compile_mlir!(mod::Reactant.MLIR.IR.Module, f::Function, args::Tuple{…}, compile_options::CompileOptions, callcache::Dict{…}, sdycache::Dict{…}; fn_kwargs::@NamedTuple{}, backend::String, runtime::Val{…}, legalize_stablehlo_to_mhlo::Bool, kwargs::@Kwargs{})
    @ Reactant.Compiler ~/.julia/dev/Reactant/src/Compiler.jl:1575
 [53] compile_mlir! (repeats 2 times)
    @ ~/.julia/dev/Reactant/src/Compiler.jl:1542 [inlined]
 [54] compile_xla(f::Function, args::Tuple{…}; before_xla_optimizations::Bool, client::Nothing, serializable::Bool, kwargs::@Kwargs{})
    @ Reactant.Compiler ~/.julia/dev/Reactant/src/Compiler.jl:3464
 [55] compile_xla
    @ ~/.julia/dev/Reactant/src/Compiler.jl:3437 [inlined]
 [56] compile(f::Function, args::Tuple{…}; kwargs::@Kwargs{})
    @ Reactant.Compiler ~/.julia/dev/Reactant/src/Compiler.jl:3536
Some type information was truncated. Use `show(err)` to see complete types.

If I instead define AnyTracedRArray{T,N} = DenseArray{TracedRNumber{T},N}, it works. However, using a vector rather than a matrix doesn't work in either case.

I will investigate further what is going on.
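
For concreteness, here is a rough sketch of the two alias definitions being compared, together with the vector analogue of the reproducer above (names taken from the discussion; this is not the actual code in src/Types.jl):

# Definition as it currently stands:
# const AnyTracedRArray{T,N} = AbstractArray{TracedRNumber{T},N}
# Experimental variant tried above:
# const AnyTracedRArray{T,N} = DenseArray{TracedRNumber{T},N}

# Vector analogue of the reproducer, which reportedly fails with either definition:
using Reactant
sum_xxᵀ(x) = sum(x .* x')
v = rand(4)
v_ca = Reactant.to_rarray(v)
@jit(sum_xxᵀ(v_ca))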

@albertomercurio
Author

OK, I think I've managed to make it work.

First, I kept the definition AnyTracedRArray{T,N} = AbstractArray{TracedRNumber{T},N} as before. If we need the DenseArray version, we could do it in another PR.

I'm not sure whether the definitions of Base.mightalias make sense; we probably need to be more specific. But at least everything works now.
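
For what it's worth, here is how aliasing detection works for ordinary Base arrays; a more targeted approach could define Base.dataids for the traced types instead of declaring whole classes of pairs non-aliasing. This is a sketch of the Base mechanism only, not the PR's implementation:

A = rand(3, 3)
Base.dataids(A)                    # tuple of UInt ids identifying A's memory
Base.mightalias(A, view(A, :, 1))  # true: the view's dataids intersect A's
Base.mightalias(A, rand(3, 3))     # false: disjoint dataids
Base.unalias(A, view(A, :, 1))     # copies the view, because it aliases A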


codecov bot commented Sep 22, 2025

Codecov Report

❌ Patch coverage is 37.50000% with 5 lines in your changes missing coverage. Please review.
✅ Project coverage is 69.75%. Comparing base (7d8866e) to head (68a6d96).
⚠️ Report is 2 commits behind head on main.

Files with missing lines      Patch %   Lines
src/Types.jl                    0.00%   4 Missing ⚠️
src/TracedRArray.jl            75.00%   1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #1696      +/-   ##
==========================================
+ Coverage   68.27%   69.75%   +1.47%     
==========================================
  Files         109      110       +1     
  Lines       11597    12129     +532     
==========================================
+ Hits         7918     8460     +542     
+ Misses       3679     3669      -10     


)
for ArrayType1 in ArrayTypesAlias
    for ArrayType2 in ArrayTypesAlias
        @eval Base.mightalias(::$ArrayType1, ::$ArrayType2) = false
Member


This definitely seems wrong, since if the two arrays are equal they certainly may alias.
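
A minimal illustration of the concern, using plain Base arrays since the traced behaviour is exactly what is in question:

x = rand(4)
Base.mightalias(x, x)         # true: an array always aliases itself
y = view(x, 2:3)
Base.mightalias(x, y)         # true: the view shares x's memory
# A blanket `Base.mightalias(::TracedRArray, ::TracedRArray) = false` would make
# both of these return false for the traced equivalents.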

Author


I'm not familiar with the internal structure of TracedRArray. In principle I could mirror the implementation for Base arrays, if I knew what to look at inside TracedRArray.

:(Base.ReshapedArray{<:TracedRNumber{T},N,<:TracedRArray})
)
@eval function Base.permutedims(A::$ArrayType, perm) where {T,N}
    return @opcall transpose(materialize_traced_array(A), Int64[perm...])
Member


Why doesn't AnyTracedRArray work here?

Author


Apparently SubArray and ReshapedArray are not subtypes of AnyTracedRArray:

julia> SubArray{<:TracedRNumber,1,<:TracedRArray} <: AnyTracedRArray
false

julia> SubArray{<:TracedRNumber,1,<:TracedRArray} |> supertypes
(SubArray{<:TracedRNumber, 1, <:TracedRArray}, AbstractVector{<:TracedRNumber}, Any)
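
The same behaviour can be reproduced with Base types alone: a UnionAll with a <: bound on the element type is not a subtype of an alias that requires the element type to be exactly a concrete wrapper instance, presumably because the bound also admits the abstract wrapper itself. Illustration only, not Reactant code:

julia> const AnyComplexArray{T,N} = AbstractArray{Complex{T},N};  # same shape as AnyTracedRArray

julia> SubArray{<:Complex,1,<:Matrix} <: AnyComplexArray
false

julia> SubArray{Complex{Float64},1,Matrix{Complex{Float64}},Tuple{UnitRange{Int64}},false} <: AnyComplexArray
true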

Member


Cc @avik-pal, I thought AnyTracedRArray was just an AbstractArray of a TracedRNumber?

Successfully merging this pull request may close these issues: Make Reactant.RArray a subtype of DenseVector.