This repository was archived by the owner on Mar 20, 2026. It is now read-only.
🚀 Feature
Support Batched inference for hub model inference.
Motivation
I am using hub models for paraphrase generation. The translation rate is slow without batching, even on GPUs.
Pitch
Allow batched sentences in https://github.com/pytorch/fairseq/blob/master/fairseq/hub_utils.py
Question
I have a fairly simple working implementation, but I need clarity on the API: shall I modify the current translate method to accept a list of sentences, so that the end user passes input like ["sentence to be translated"]?
Example Code
https://gist.github.com/sai-prasanna/d4b280ca171024b9114bbb631d0d32b9
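One backwards-compatible way to settle the API question is to let translate accept either a single string or a list of strings and batch internally. The sketch below is illustrative only: the function name, the type dispatch, and the stand-in translation callable are assumptions, not fairseq's actual hub_utils implementation.

```python
def translate(sentences, _translate_fn=None):
    """Accept either one sentence (str) or a list of sentences.

    Returns a single translation for str input and a list of
    translations for list input, so existing single-sentence
    callers keep working unchanged.
    """
    # Stand-in "translation" for illustration; a real implementation
    # would run the model's forward pass here.
    if _translate_fn is None:
        _translate_fn = lambda batch: [s.upper() for s in batch]

    single = isinstance(sentences, str)
    batch = [sentences] if single else list(sentences)

    # One batched call over all sentences instead of a Python loop,
    # which is where the GPU throughput gain comes from.
    results = _translate_fn(batch)

    return results[0] if single else results
```

Mirroring the input type in the output (str in, str out; list in, list out) avoids breaking current users of the single-sentence API while still enabling batched calls.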