
[tfjs-tflite] How to use a particular subgraph of a tflite model? #6919


Description

@josephrocca

This tflite file has two subgraphs: encode and decode. If I load the model with tflite.loadTFLiteModel and then call model.predict, it uses the decode subgraph by default (this subgraph is also the one shown by default when loading the model in netron.app).
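
For reference, this is roughly how I'm loading and running it (modelUrl is a placeholder for the actual .tflite file URL):

// modelUrl is a placeholder for the actual .tflite file URL:
const model = await tflite.loadTFLiteModel(modelUrl);
// Calling predict() like this runs the decode subgraph:
let output = model.predict({"decode_encoding_indices:0": bits});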

How can I use the encode subgraph? I couldn't find any docs on this, and after sleuthing through the tfjs-tflite code I wasn't able to find any relevant config options.

I initially thought that I could just refer to the encode subgraph's inputs/outputs by name, but that approach only works for the input/output of the decode subgraph:

// works:
let output = model.predict({"decode_encoding_indices:0":bits});
// doesn't work (throws error, shown below):
let output = model.predict({"encode_input_frames:0":embedding, "encode_num_quantizers:0":numQuantizers}); 

The error thrown by the latter is:

Uncaught Error: The model input names don't match the model input names. Names in input but missing in model: [encode_input_frames:0,encode_num_quantizers:0]. Names in model but missing in inputs: [decode_encoding_indices:0].
    at TFLiteModel.checkMapInputs (tf-tflite.js:10318:19)
    at TFLiteModel.predict (tf-tflite.js:10159:22)
    at quantize ((index):53:36)
    at testQuantization (<anonymous>:6:24)
    at <anonymous>:1:7

I've double-checked that I've got the input names correct.
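
For what it's worth, listing the input names that the loaded model itself reports (assuming TFLiteModel exposes the standard InferenceModel inputs property; I may be misreading the API) is consistent with the error above, i.e. only the decode subgraph's input shows up:

// List the input tensor names the loaded model exposes (assumption:
// TFLiteModel implements tfjs's InferenceModel interface, so `inputs`
// is an array of tensor infos with a `name` field):
console.log(model.inputs.map(info => info.name));
// Based on the error above, this only includes "decode_encoding_indices:0".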
