Describe the bug 🐞
The first layer of the NOMAD model behaves unexpectedly: its output shape does not match the declared output size of the approximator network, so it is unclear what the layer is doing internally.
Minimal Reproducible Example 👇
using Lux, NeuralOperators, Random

# approximator maps 8 -> 7; the decoder's first Dense(8 => 4) therefore
# expects the 7 approximator outputs concatenated with the 1-row coordinates y
nomad = NOMAD(Chain(Dense(8 => 8, σ), Dense(8 => 7)),
    Chain(Dense(8 => 4, σ), Dense(4 => 1)))
rng = Random.default_rng()
ps, st = Lux.setup(rng, nomad)
u = rand(Float32, 8, 10)  # batch of 10 input functions sampled at 8 points
y = rand(Float32, 1, 10)  # batch of 10 evaluation coordinates
# apply only the first layer of the wrapped model
v = nomad.model.layers[1](u, ps.layer_1, st.layer_1)[1]
@show size(v)
Error & Stacktrace
size(v) = (15, 10)
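For what it's worth, 15 = 8 + 7, so my guess is that the first layer concatenates the original input to the approximator output, like a SkipConnection with vcat. This is only a hypothesis about the internals, not confirmed from the source, but a minimal sketch of it in plain Lux reproduces the same shape:
using Lux, Random

# Guess: the first layer acts like SkipConnection(approximator, vcat),
# stacking the approximator output (7 rows) on top of the input (8 rows)
skip = SkipConnection(Chain(Dense(8 => 8, σ), Dense(8 => 7)), vcat)
rng = Random.default_rng()
ps_s, st_s = Lux.setup(rng, skip)
u = rand(Float32, 8, 10)
w = skip(u, ps_s, st_s)[1]
@show size(w)  # (15, 10), same as the reported output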
Expected behavior
The output of the first layer is expected to have shape (7, 10), matching the approximator's final Dense(8 => 7), so that it can be concatenated with the coordinates y and passed to the decoder; see the shape walk-through below.
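For reference, the shape arithmetic of the expected forward pass, assuming (as the NOMAD design suggests) that the decoder receives the approximator output stacked with y:
# Expected shapes, assuming decoder input = vcat(approximator(u), y)
v_expected = rand(Float32, 7, 10)  # approximator output, (7, 10)
y = rand(Float32, 1, 10)           # evaluation coordinates, (1, 10)
decoder_in = vcat(v_expected, y)   # (8, 10), matching Dense(8 => 4)
@assert size(decoder_in) == (8, 10)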
Environment (please complete the following information):
- Output of using Pkg; Pkg.status():
  [b2108857] Lux v1.4.1
  [ea5c82af] NeuralOperators v0.5.2
- Output of versioninfo():
  Julia Version 1.11.1