Get a quantization parameter via node-tfjs-tflite #8076

Open
@bartbutenaers

Description

System information

  • TensorFlow.js version (you are using): 3.21.0
  • Are you willing to contribute it (Yes/No): Yes

Describe the feature and the current behavior/state.

We have been stuck for a few months trying to implement object detection with YOLOv5 on a Coral TPU stick via tfjs-tflite-node, which we need for video surveillance in Node-RED. The problem is that the bounding boxes end up at the wrong locations in the image, i.e. not around the detected objects. It turned out that the model contains quantization parameters, which we need to extract from the tflite model in order to scale the input and output tensors correctly.
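To illustrate the scaling we are after: TFLite quantized tensors use the standard affine mapping real = scale * (quantized - zero_point). A minimal TypeScript sketch (the `quantize`/`dequantize` helper names are mine, not part of any tfjs API):

```typescript
// Affine dequantization: real = scale * (q - zeroPoint).
function dequantize(q: Uint8Array, scale: number, zeroPoint: number): Float32Array {
  const out = new Float32Array(q.length);
  for (let i = 0; i < q.length; i++) {
    out[i] = scale * (q[i] - zeroPoint);
  }
  return out;
}

// Inverse mapping for preparing a uint8 input tensor, clamped to [0, 255].
function quantize(x: Float32Array, scale: number, zeroPoint: number): Uint8Array {
  const out = new Uint8Array(x.length);
  for (let i = 0; i < x.length; i++) {
    out[i] = Math.min(255, Math.max(0, Math.round(x[i] / scale + zeroPoint)));
  }
  return out;
}
```

So once scale and zero_point are exposed per tensor, the YOLOv5 box coordinates in the output tensor could be dequantized before drawing them on the image.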

We tried to solve this via the forum first, but since we are still stuck we hope to find a solution via this GitHub issue. Hopefully that is ok...

I tried to change the TensorInfo class by adding the quantization parameter to it:

class TensorInfo : public Napi::ObjectWrap<TensorInfo> {
  static Napi::Object Init(Napi::Env env, Napi::Object exports) {
    Napi::Function func = DefineClass(env, "TensorInfo", {
        ...
        // Expose the quantization parameters as a read-only property.
        InstanceAccessor<&TensorInfo::GetQuantizationParams>("quantizationParameter"),
        ...
    });
    ...
  }

  Napi::Value GetQuantizationParams(const Napi::CallbackInfo& info) {
    Napi::Env env = info.Env();

    // Get the quantization parameters from the wrapped TfLiteTensor.
    TfLiteQuantizationParams quant_params = TfLiteTensorQuantizationParams(tensor);

    // Create a new JavaScript object to hold the parameters.
    Napi::Object js_quant_params = Napi::Object::New(env);
    js_quant_params.Set("scale", Napi::Number::New(env, quant_params.scale));
    js_quant_params.Set("zero_point", Napi::Number::New(env, quant_params.zero_point));

    return js_quant_params;
  }
};

Not sure if this code is ok, because this is quite far from my comfort zone to be honest...
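For context, this is how I imagine the JS side would consume the object built by that native accessor. The tensor info object below is mocked (not the real tfjs-tflite API); only the `quantizationParameter` field name matches the accessor registered in the C++ code:

```typescript
// Shape of the object returned by the native GetQuantizationParams.
interface QuantizationParams {
  scale: number;      // multiplier applied when dequantizing
  zero_point: number; // quantized value that maps to real 0.0
}

// Mocked tensor info standing in for what the native addon would return.
const info = {
  name: 'output0',
  quantizationParameter: {scale: 0.0039, zero_point: 0} as QuantizationParams,
};

// Dequantize one raw uint8 value from the output tensor:
const {scale, zero_point} = info.quantizationParameter;
const real = scale * (200 - zero_point);
```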

My next step was to make sure the unit tests were working. However, it seemed I first needed to pass my new parameter through the tflite_model.ts file:

private convertTFLiteTensorInfos(infos: TFLiteWebModelRunnerTensorInfo[]): ModelTensorInfo[] {
    return infos.map(info => {
      const dtype = getDTypeFromTFLiteType(info.dataType);
      return {
        name: info.name,
        shape: this.getShapeFromTFLiteTensorInfo(info),
        dtype,
        quantizationParameter: info.quantizationParameter,
      };
    });
  }

However, that fails because the TFLiteWebModelRunnerTensorInfo interface doesn't contain my new field. And I assume I cannot simply add it to the interface, because then all backends would need to support it.
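One idea (just an assumption on my side, not something confirmed by the tfjs team): declare the new field as optional, so backends that don't populate it still conform to the interface. Roughly, with the other existing members elided:

```typescript
// Sketch of the interface extension; only the fields referenced in
// convertTFLiteTensorInfos are shown, the rest of the real interface is elided.
interface TFLiteWebModelRunnerTensorInfo {
  name: string;
  dataType: string;
  shape: string;
  // Optional, so backends without quantization support still conform:
  quantizationParameter?: {scale: number; zero_point: number};
}
```

With the field optional, `info.quantizationParameter` would simply be `undefined` for backends that don't provide it, and convertTFLiteTensorInfos could pass it through unchanged.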

It would be nice to get some tips on how we can proceed with this.

Thanks!!!
Bart

Will this change the current api? How?

That is not clear to me, but I assume 'yes'.

Who will benefit with this feature?

Use cases with tflite models that contain quantization parameters.

Any Other info.
