Keras 3.12.0


Released by @fchollet on 27 Oct (commit adbfd13)

Highlights

Keras has a new model distillation API!

You now have access to an easy-to-use API for distilling large models into smaller ones while minimizing the performance drop on a reference dataset -- and it is compatible with all existing Keras models. You can choose from a range of built-in distillation losses or create your own, and the API supports applying multiple distillation losses at the same time.

Example:

# Load a model to distill
teacher = ...
# The smaller model we want to distill the teacher into
student = ...

# Configure the process
distiller = Distiller(
    teacher=teacher,
    student=student,
    distillation_losses=LogitsDistillation(temperature=3.0),
)
distiller.compile(
    optimizer='adam',
    loss='sparse_categorical_crossentropy',
    metrics=['accuracy']
)

# Train the distilled model
distiller.fit(x_train, y_train, epochs=10)
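
Since the distillation_losses argument is plural and the description above mentions applying multiple losses at once, passing a list presumably works. The sketch below assumes exactly that, reusing only the LogitsDistillation class shown above:

# A sketch of combining several distillation losses at once
# (assumes `distillation_losses` also accepts a list, as its plural
# name and the description above suggest).
distiller = Distiller(
    teacher=teacher,
    student=student,
    distillation_losses=[
        LogitsDistillation(temperature=2.0),
        LogitsDistillation(temperature=5.0),
    ],
)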

Keras supports GPTQ quantization!

GPTQ is now built into the Keras API. GPTQ is a post-training, weights-only quantization method that compresses a model to int4 layer by layer. For each layer, it uses a second-order method to update weights while minimizing the error on a calibration dataset.

Learn how to use it in this guide.

Example:

model = keras_hub.models.Gemma3CausalLM.from_preset("gemma3_1b")
gptq_config = keras.quantizers.GPTQConfig(
    dataset=calibration_dataset,
    tokenizer=model.preprocessor.tokenizer,
    weight_bits=4,
    group_size=128,
    num_samples=256,
    sequence_length=256,
    hessian_damping=0.01,
    symmetric=False,
    activation_order=False,
)
model.quantize("gptq", config=gptq_config)
outputs = model.generate(prompt, max_length=30)
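
Roughly, the configuration above works as follows: weight_bits=4 targets int4, group_size=128 means 128 weights share one quantization scale, num_samples and sequence_length control how much calibration data is drawn from dataset, hessian_damping stabilizes the second-order update, symmetric selects symmetric vs. asymmetric quantization ranges, and activation_order enables GPTQ's "act-order" variant, which quantizes columns in order of decreasing Hessian diagonal. See the GPTQConfig API reference for the authoritative definitions.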

Better support for Grain datasets!

  • Add Grain support to keras.utils.image_dataset_from_directory and keras.utils.text_dataset_from_directory. Specify format="grain" to return a Grain dataset instead of a TF dataset (see the example after this list).
  • Make almost all Keras preprocessing layers compatible with Grain datasets.
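
For example, a minimal sketch (the directory path is hypothetical; the format="grain" argument is the new addition, the other arguments are standard):

# Build a Grain dataset from an image folder
# ("path/to/images" is a hypothetical directory).
train_ds = keras.utils.image_dataset_from_directory(
    "path/to/images",
    image_size=(224, 224),
    batch_size=32,
    format="grain",
)
# The result is a Grain dataset rather than a tf.data.Dataset;
# it can be passed to model.fit() as usual.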

New features

  • Add keras.layers.ReversibleEmbedding layer: an embedding layer that can also project back to the input space. Use it with the reverse argument in call() (see the sketch after this list).
  • Add an opset_version argument to model.export(), specific to format="onnx", for specifying the ONNX opset version.
  • Add keras.ops.isin op.
  • Add keras.ops.isneginf, keras.ops.isposinf ops.
  • Add keras.ops.isreal op.
  • Add keras.ops.cholesky_inverse op and add upper argument in keras.ops.cholesky.
  • Add keras.ops.image.scale_and_translate op.
  • Add keras.ops.hypot op.
  • Add keras.ops.gcd op.
  • Add keras.ops.kron op.
  • Add keras.ops.logaddexp2 op.
  • Add keras.ops.view op.
  • Add keras.ops.unfold op.
  • Add keras.ops.jvp op.
  • Add keras.ops.trapezoid op.
  • Add support for over 20 new ops in the OpenVINO backend.
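
A quick tour of a few of the additions above, as a sketch: the new ops are assumed to follow their NumPy namesakes' signatures, and ReversibleEmbedding is assumed to take Embedding-style input_dim/output_dim arguments (only the reverse argument is confirmed above).

import numpy as np
import keras
from keras import ops

x = np.array([1.0, 2.0, 3.0, 4.0])

# Element-wise membership test (NumPy-style isin)
mask = ops.isin(x, np.array([2.0, 4.0]))     # [False, True, False, True]

# A few of the other NumPy-style additions
h = ops.hypot(np.array(3.0), np.array(4.0))  # 5.0
g = ops.gcd(np.array(12), np.array(18))      # 6
k = ops.kron(np.eye(2), np.array([[1.0, 2.0], [3.0, 4.0]]))

# ReversibleEmbedding: embed token ids, then use reverse=True to
# project hidden states back to vocabulary-sized logits.
# (Constructor argument names are assumed to mirror keras.layers.Embedding.)
embedding = keras.layers.ReversibleEmbedding(input_dim=1000, output_dim=64)
token_ids = np.array([[1, 2, 3]])
embedded = embedding(token_ids)              # shape: (1, 3, 64)
logits = embedding(embedded, reverse=True)   # shape: (1, 3, 1000)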

Breaking changes

  • The StringLookup and IntegerLookup layers now save vocabularies loaded from file. Previously, when these layers were instantiated from a vocabulary filepath, only the filepath was saved with the layer. Now, the entire vocabulary is materialized and saved as part of the .keras archive.
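
For illustration, a minimal sketch of the affected pattern (vocab.txt is a hypothetical vocabulary file, one token per line):

import keras

# Vocabulary loaded from a file path (vocab.txt is hypothetical).
lookup = keras.layers.StringLookup(vocabulary="vocab.txt")

model = keras.Sequential([
    keras.Input(shape=(1,), dtype="string"),
    lookup,
])
# As of 3.12, saving materializes the full vocabulary inside the
# .keras archive; previously only the "vocab.txt" path was stored.
model.save("lookup_model.keras")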

Security fixes

New Contributors

Full Changelog: v3.11.0...v3.12.0