Description
System information
- TensorFlow.js version: 4.2.0
- Are you willing to contribute it: Yes
Describe the feature and the current behavior/state.
In Python it is possible to call a layer like this:
tf.keras.layers.BatchNormalization()(f, training=True)
Another example:
u = tf.keras.layers.Dropout(0.3)(u, training=True)
This way, the layer's call method always receives the kwarg training=True, even at prediction time, without having to call the model like this:
model(input, training=True)
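To make the flag-baking behavior concrete without requiring TensorFlow, here is a minimal sketch using a plain-Python stand-in for Dropout (the class and helper below are illustrative, not part of any library): once training=True is fixed when the layer is wired into the model, callers get training-mode behavior even though they never pass the flag themselves.

```python
import random

class Dropout:
    """Plain-Python stand-in for tf.keras.layers.Dropout, used only to
    illustrate how a training= value baked in at wiring time takes effect
    at prediction time."""
    def __init__(self, rate):
        self.rate = rate

    def __call__(self, x, training=False):
        if not training:
            return list(x)  # inference: identity
        # training: zero out units, scale the survivors by 1/(1 - rate)
        return [0.0 if random.random() < self.rate else v / (1.0 - self.rate)
                for v in x]

def build_baked(rate):
    """Wire the layer with training=True fixed, mirroring the Keras
    snippet above; callers never pass a training flag."""
    layer = Dropout(rate)
    return lambda x: layer(x, training=True)

model = build_baked(0.5)
out = model([1.0, 2.0, 3.0])  # dropout is applied even though the caller
                              # never passed training=True
```

This is exactly the behavior the issue asks TFJS to preserve after conversion: the training override travels with the layer call, not with the top-level model invocation.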
When a model is converted from Python TF to TFJS, it does not seem to behave this way, so the user is forced to call it like this:
await this.loaded_model.apply(batched, {'training': true});
as calling it like this:
this.loaded_model.predict(batched)
does not work as expected (i.e., the BatchNormalization layer still runs in inference mode).
Also, models in which at least one layer uses this signature cannot be converted in graph mode.
Who will benefit from this feature?
People who use GANs with TFJS, since in most GAN models batch normalization needs to run in training mode during prediction.
Any Other info.
Please comment if my point is not clear.