Drop compatibility with pre-LTS versions (#174)
* Drop compatibility with pre-LTS versions

* Update README

* Use StableRNGs.jl to stabilize random seeds in tests across Julia versions
adrhill authored Oct 10, 2024
1 parent b833691 commit cf65291
Showing 14 changed files with 14 additions and 27 deletions.
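For context on the third commit message bullet: `MersenneTwister` makes no guarantee that a given seed produces the same random stream across Julia releases, so reference tests pinned to its output can break on a version bump, while `StableRNG` from StableRNGs.jl guarantees a fixed stream for a fixed seed. A minimal sketch of the difference (not part of this commit):

```julia
using Random: MersenneTwister
using StableRNGs: StableRNG

# MersenneTwister: the same seed may yield a different stream on a
# different Julia version, breaking hard-coded expected test values.
mt_values = rand(MersenneTwister(123), Float32, 3)

# StableRNG: the same seed yields the same stream on every Julia
# version, so stored reference outputs (e.g. the .jld2 files updated
# below) stay valid across version bumps.
stable_values = rand(StableRNG(123), Float32, 3)

# A reference test can now safely pin exact values:
# @test stable_values ≈ Float32[...]  # values elided; fill from a real run
```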
2 changes: 1 addition & 1 deletion .github/workflows/ci.yml
@@ -22,7 +22,7 @@ jobs:
         version:
           - 'lts'
           - '1'
-          # - 'pre' # TODO: tests on pre-release version
+          - 'pre'
     steps:
       - uses: actions/checkout@v4
       - uses: julia-actions/setup-julia@v2
2 changes: 1 addition & 1 deletion Project.toml
@@ -22,4 +22,4 @@ Reexport = "1"
 Statistics = "<0.0.1, 1"
 XAIBase = "4"
 Zygote = "0.6"
-julia = "1.6"
+julia = "1.10"
2 changes: 1 addition & 1 deletion README.md
@@ -16,7 +16,7 @@ and [iNNvestigate][innvestigate-repo] for Keras models.
 [^1]: More specifically, models currently have to be differentiable with [Zygote.jl](https://github.com/FluxML/Zygote.jl).
 
 ## Installation
-This package supports Julia ≥1.6. To install it, open the Julia REPL and run
+This package supports Julia ≥1.10. To install it, open the Julia REPL and run
 ```julia-repl
 julia> ]add ExplainableAI
 ```
1 change: 0 additions & 1 deletion src/ExplainableAI.jl
@@ -14,7 +14,6 @@ using DifferentiationInterface: value_and_pullback
 using Zygote
 const DEFAULT_AD_BACKEND = AutoZygote()
 
-include("compat.jl")
 include("bibliography.jl")
 include("input_augmentation.jl")
 include("gradient.jl")
12 changes: 0 additions & 12 deletions src/compat.jl

This file was deleted.

2 changes: 1 addition & 1 deletion test/Project.toml
@@ -8,6 +8,6 @@ JuliaFormatter = "98e50ef6-434e-11e9-1051-2b60c6c9e899"
 LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
 Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
 ReferenceTests = "324d217c-45ce-50fc-942e-d289b448e8cf"
-Suppressor = "fd094767-a336-5f1f-9728-57cf17d0bbfb"
+StableRNGs = "860ef19b-820b-49d6-a774-d7a799459cd3"
 Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"
 XAIBase = "9b48221d-a747-4c1b-9860-46a1d8ba24a7"
Binary file modified test/references/cnn/GradCAM_max.jld2
Binary file modified test/references/cnn/Gradient_max.jld2
Binary file modified test/references/cnn/InputTimesGradient_max.jld2
Binary file modified test/references/cnn/IntegratedGradients_max.jld2
Binary file modified test/references/cnn/SmoothGrad_max.jld2
5 changes: 2 additions & 3 deletions test/runtests.jl
@@ -6,14 +6,13 @@ using Aqua
 using JET
 
 @testset "ExplainableAI.jl" begin
-    if VERSION >= v"1.10"
     @info "Testing formalities..."
     @testset verbose = true "Linting" begin
         @testset "Code formatting" begin
             @info "- running JuliaFormatter code formatting tests..."
             @test JuliaFormatter.format(ExplainableAI; verbose=false, overwrite=false)
         end
         @testset "Aqua.jl" begin
-            @info "- running Aqua.jl tests. These might print warnings from dependencies..."
+            @info "- running Aqua.jl tests..."
             Aqua.test_all(ExplainableAI; ambiguities=false)
         end
         @testset "JET tests" begin
8 changes: 4 additions & 4 deletions test/test_batches.jl
@@ -5,7 +5,7 @@ using Flux
 using Random
 using Distributions: Laplace
 
-pseudorand(dims...) = rand(MersenneTwister(123), Float32, dims...)
+pseudorand(dims...) = rand(StableRNG(123), Float32, dims...)
 
 ## Test `fuse_batchnorm` on Dense and Conv layers
 ins = 20
@@ -15,16 +15,16 @@ batchsize = 15
 model = Chain(Dense(ins, 15, relu; init=pseudorand), Dense(15, outs, relu; init=pseudorand))
 
 # Input 1 with batch dimension
-input1 = rand(MersenneTwister(1), Float32, ins, 1)
+input1 = rand(StableRNG(1), Float32, ins, 1)
 # Input 2 with batch dimension
-input2 = rand(MersenneTwister(2), Float32, ins, 1)
+input2 = rand(StableRNG(2), Float32, ins, 1)
 # Batch containing inputs 1 & 2
 input_batch = cat(input1, input2; dims=2)
 
 ANALYZERS = Dict(
     "Gradient" => Gradient,
     "InputTimesGradient" => InputTimesGradient,
-    "SmoothGrad" => m -> SmoothGrad(m, 5, 0.1, MersenneTwister(123)),
+    "SmoothGrad" => m -> SmoothGrad(m, 5, 0.1, StableRNG(123)),
     "IntegratedGradients" => m -> IntegratedGradients(m, 5),
     "GradCAM" => m -> GradCAM(m[1], m[2]),
 )
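For reference, a sketch (not part of the diff) of how these analyzer factories are typically exercised in the batch tests, assuming ExplainableAI's exported `analyze(input, analyzer)` API and the `model`, `input1`, and `input_batch` defined above:

```julia
# Each ANALYZERS entry maps a name to a constructor taking the model.
for (name, constructor) in ANALYZERS
    analyzer = constructor(model)
    # The batch tests check that analyzing a batch is consistent with
    # analyzing each input on its own (comparing the `val` fields of
    # the returned Explanation objects column-wise).
    expl_single = analyze(input1, analyzer)
    expl_batch = analyze(input_batch, analyzer)
end
```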
7 changes: 4 additions & 3 deletions test/test_cnn.jl
@@ -4,21 +4,22 @@ using ReferenceTests
 
 using Flux
 using Random
+using StableRNGs: StableRNG
 using JLD2
 
 const GRADIENT_ANALYZERS = Dict(
     "InputTimesGradient" => InputTimesGradient,
-    "SmoothGrad" => m -> SmoothGrad(m, 5, 0.1, MersenneTwister(123)),
+    "SmoothGrad" => m -> SmoothGrad(m, 5, 0.1, StableRNG(123)),
     "IntegratedGradients" => m -> IntegratedGradients(m, 5),
     "GradCAM" => m -> GradCAM(m[1], m[2]),
 )
 
-pseudorand(dims...) = rand(MersenneTwister(123), Float32, dims...)
+pseudorand(dims...) = rand(StableRNG(123), Float32, dims...)
 
 input_size = (32, 32, 3, 1)
 input = pseudorand(input_size)
 
-init(dims...) = Flux.glorot_uniform(MersenneTwister(123), dims...)
+init(dims...) = Flux.glorot_uniform(StableRNG(123), dims...)
 
 model = Chain(
     Chain(
