Add scores for generated text in inference mode #164
base: master
Conversation
We found a Contributor License Agreement for you (the sender of this pull request), but were unable to find agreements for all the commit author(s) or Co-authors. If you authored these, maybe you used a different email address in the git commits than was used to sign the CLA (login here to double check)? If these were authored by someone else, then they will need to sign a CLA as well, and confirm that they're okay with these being contributed to Google.
CLAs look good, thanks!
…nsistent with other scores. create a compute_score function to remove duplicate code.
Actually, this is going to result in an approximate doubling of inference time. Can you make it so the score is computed in sample_autoregressive?
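The suggestion above, computing the score inside the decoding loop rather than with a second forward pass, can be sketched as follows. This is a plain NumPy illustration with a hypothetical `step_fn` interface, not the actual Mesh TensorFlow `sample_autoregressive` signature: each decoding step already produces the logits for the chosen token, so accumulating their log-probabilities adds essentially no cost.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def greedy_decode_with_scores(step_fn, state, max_len):
    """Greedy decoding that also returns the sequence log-likelihood.

    step_fn(state) -> (logits, new_state) gives next-token logits;
    both names are hypothetical, chosen only for this sketch.
    """
    tokens = []
    log_likelihood = 0.0
    for _ in range(max_len):
        logits, state = step_fn(state)
        probs = softmax(logits)
        token = int(np.argmax(probs))
        # Accumulate the log-probability of the token we just emitted:
        # no extra forward pass is needed.
        log_likelihood += float(np.log(probs[token]))
        tokens.append(token)
    return tokens, log_likelihood
```

The same accumulation works for sampled (non-greedy) decoding; only the token-selection line changes.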
@allen-q do you plan on following up with this? thanks!
Sorry for the late reply. I'm still planning to make it work but probably
won't have time in the next couple of weeks. Happy for someone to take a
look at it in the meantime.
No worries. Perhaps we can just gate this with a bool arg for now until we have the "free" version?
I am currently working on a
Background:
I was using the T5 model and wanted to get the scores in inference mode along with the generated text. However, this feature is not supported by T5 at the moment, and I was advised to implement it and raise a pull request. Please see google-research/text-to-text-transfer-transformer#311 for more details.
This PR adds the scores (log likelihoods) alongside the generated text in the outputs when a model is exported in SavedModel format.
Changed file:
./mesh/mesh_tensorflow/transformer/utils.py
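The exported score is the log likelihood of the generated sequence under the model. As a rough illustration of that quantity, here is a minimal NumPy sketch (a hypothetical helper written for this description, not the `compute_score` function in `utils.py`): it gathers the log-probability of each target token from the model's logits and sums them.

```python
import numpy as np

def sequence_log_likelihood(logits, targets):
    """Sum of per-token log-probabilities.

    logits: [seq_len, vocab_size] array of unnormalized scores.
    targets: [seq_len] array of token ids (the generated sequence).
    """
    # Numerically stable log-softmax over the vocabulary axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=-1, keepdims=True))
    # Pick out the log-probability of each generated token and sum.
    return float(log_probs[np.arange(len(targets)), targets].sum())
```

Summing log-probabilities (rather than multiplying probabilities) keeps the score numerically stable for long sequences, and makes scores from different beams directly comparable.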
SignatureDef Diff
Below is how a T5 MTF SavedModel SignatureDef looks before the change:
Below is how a T5 MTF SavedModel SignatureDef looks after the change:
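A SavedModel's SignatureDef can be printed with TensorFlow's `saved_model_cli show` tool. As a purely hypothetical illustration of the shape of such a diff (the dtypes, shapes, and tensor names below are placeholders, not the actual exported signature), the change adds a `scores` output alongside the existing text output:

```
The given SavedModel SignatureDef contains the following output(s):
  outputs['outputs'] tensor_info:
      dtype: DT_STRING
      shape: (-1)
      name: ...
  outputs['scores'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1)
      name: ...
```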