This repository was archived by the owner on Jan 21, 2025. It is now read-only.

Add probabilities for generated text in inference model #163

Closed

Conversation

allen-q

@allen-q allen-q commented Aug 20, 2020

Background:

I was using the T5 model and wanted to get the probabilities of the generated text at inference time. However, T5 does not currently support this, and I was advised to implement the feature and raise a pull request.

This PR implements that feature: when a model is exported in SavedModel format, the outputs now include the probabilities alongside the generated text.

Changed file:
./mesh/mesh_tensorflow/transformer/utils.py

SignatureDef Diff
Below is how a T5 MTF SavedModel SignatureDef looks before the change:

The given SavedModel SignatureDef contains the following input(s):
  inputs['input'] tensor_info:
      dtype: DT_STRING
      shape: (-1)
      name: inputs:0
The given SavedModel SignatureDef contains the following output(s):
  outputs['inputs'] tensor_info:
      dtype: DT_STRING
      shape: (10)
      name: SentenceTokenizer/SentenceTokenizer/SentencepieceDetokenizeOp:0
  outputs['outputs'] tensor_info:
      dtype: DT_STRING
      shape: (10)
      name: SentenceTokenizer_1/SentenceTokenizer/SentencepieceDetokenizeOp:0
Method name is: tensorflow/serving/predict

Below is how a T5 MTF SavedModel SignatureDef looks after the change:

The given SavedModel SignatureDef contains the following input(s):
  inputs['input'] tensor_info:
      dtype: DT_STRING
      shape: (-1)
      name: inputs:0
The given SavedModel SignatureDef contains the following output(s):
  outputs['inputs'] tensor_info:
      dtype: DT_STRING
      shape: (10)
      name: SentenceTokenizer/SentenceTokenizer/SentencepieceDetokenizeOp:0
  outputs['outputs'] tensor_info:
      dtype: DT_STRING
      shape: (10)
      name: SentenceTokenizer_1/SentenceTokenizer/SentencepieceDetokenizeOp:0
  outputs['probabilities'] tensor_info:
      dtype: DT_FLOAT
      shape: (10)
      name: reshape_17/parallel_0/Reshape:0
Method name is: tensorflow/serving/predict
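For context, one common way such a per-example score is derived (an assumption here for illustration; the actual computation lives in this PR's changes to utils.py) is to sum the per-token log-probabilities assigned by the decoder and exponentiate, which yields the probability of the whole generated sequence:

```python
import math

def sequence_probability(token_logprobs):
    """Probability of a generated sequence as the product of per-token
    probabilities, accumulated in log space for numerical stability."""
    return math.exp(sum(token_logprobs))

# Hypothetical example: a 3-token output whose tokens were assigned
# probabilities 0.9, 0.8 and 0.5 by the decoder.
logprobs = [math.log(0.9), math.log(0.8), math.log(0.5)]
p = sequence_probability(logprobs)  # 0.9 * 0.8 * 0.5 = 0.36
```

In the exported model this score appears as the DT_FLOAT `probabilities` tensor above, with one value per decoded example in the batch.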

@googlebot

We found a Contributor License Agreement for you (the sender of this pull request), but were unable to find agreements for all the commit author(s) or Co-authors. If you authored these, maybe you used a different email address in the git commits than was used to sign the CLA (login here to double check)? If these were authored by someone else, then they will need to sign a CLA as well, and confirm that they're okay with these being contributed to Google.
In order to pass this check, please resolve this problem and then comment "@googlebot I fixed it." If the bot doesn't comment, it means it doesn't think anything has changed.

ℹ️ Googlers: Go here for more info.

@allen-q allen-q changed the title Add probabilities for generated text in the SavedModel output Add probabilities for generated text in inference model Aug 20, 2020
@allen-q allen-q closed this Aug 20, 2020
@allen-q
Author

allen-q commented Aug 20, 2020

Closing this pull request in favour of #164, as I used an incorrect email address in a commit on this branch.

@allen-q allen-q deleted the Add_probability_in_predict_mode branch August 20, 2020 11:45