RNN is a big concept to support in our refactoring. Based on the current support of our framework, it is better to implement RNNs in the following stages:

- `recurrent_op`, an RNN operator that takes a tensor as input.
- `dynamic_recurrent_op`, an RNN that takes variable-length sequences as input and outputs sequences; it should replace `recurrent_op`.
- built-in beam search, which will be a method in `dynamic_recurrent_op` and make `dynamic_recurrent_op` an equivalent of the old `RecurrentGradientMachine`.
- dynamic RNN based on `while_loop` and some other conditional operators; after we support this, the infrastructure might apply to some other dynamic models such as Tree-LSTM.
- dynamic beam search, that is, a beam search built on `while_loop` and other conditional operators.
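To make the `while_loop`-based dynamic RNN idea concrete, here is a pure-Python sketch. The names (`dynamic_rnn`, `step_fn`) and the plain list standing in for a `TensorArray` are illustrative only, not the framework's actual API; the point is the control flow a `while_loop` plus a `less_than`-style condition would express.

```python
def dynamic_rnn(step_fn, sequences, init_state):
    """Run step_fn over variable-length sequences, one time step at a time.

    sequences: one inner list per sample, possibly of different lengths.
    step_fn(inputs, state) -> (output, new_state), where inputs holds the
    current-step items of the samples that are still active.

    This is a conceptual sketch, not the framework's API.
    """
    max_len = max(len(s) for s in sequences)
    state = init_state
    outputs = []          # plays the role of a TensorArray: one entry per step
    t = 0
    while t < max_len:    # the while_loop; the condition is a less_than check
        # gather current-step inputs, dropping samples that already finished
        inputs = [s[t] for s in sequences if len(s) > t]
        out, state = step_fn(inputs, state)
        outputs.append(out)
        t = t + 1
    return outputs, state
```

For example, with a step function that sums the active inputs and accumulates them into the state, `dynamic_rnn(lambda xs, st: (sum(xs), st + sum(xs)), [[1, 2, 3], [4, 5]], 0)` walks three time steps even though the second sequence ends after two.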
Milestones
- support a neural machine translation model: `dynamic_recurrent_op` with a built-in beam search module should be ready at that point.
- a text classification model with dynamic RNNs: the operators, not limited to the following ones, should be ready: `pd.while_loop`, `pd.equals`, `pd.TensorArray`, `pd.less_than`.
- a machine translation model based on dynamic beam search (maybe wrapped as a `generator`): this needs more dynamic operators to be ready.
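The dynamic beam search in the last milestone can likewise be sketched in pure Python. Everything here is a hypothetical stand-in: `candidates_fn` models one decoder step that scores possible next tokens, and the function itself only illustrates the expand-prune loop that `while_loop` and the conditional operators would have to express.

```python
import math

def beam_search(candidates_fn, start, beam_size, max_len, eos):
    """Length-bounded beam search (conceptual sketch, not the framework's API).

    candidates_fn(seq) -> list of (next_token, log_prob) for extending seq.
    Returns the highest-scoring (sequence, log_prob) pair found.
    """
    beams = [([start], 0.0)]   # (partial sequence, accumulated log-prob)
    finished = []
    for _ in range(max_len):
        expanded = []
        for seq, score in beams:
            for tok, lp in candidates_fn(seq):
                hyp = (seq + [tok], score + lp)
                if tok == eos:
                    finished.append(hyp)   # hypothesis ended with EOS
                else:
                    expanded.append(hyp)
        if not expanded:
            break                          # every beam has finished
        # prune: keep only the beam_size best partial hypotheses
        expanded.sort(key=lambda p: p[1], reverse=True)
        beams = expanded[:beam_size]
    finished.extend(beams)                 # also consider unfinished beams
    finished.sort(key=lambda p: p[1], reverse=True)
    return finished[0]
```

A toy `candidates_fn` with a known best path (e.g. `a -> b -> eos` with probability 0.6 * 0.9) recovers that path with `beam_size=2`; the old `RecurrentGradientMachine` bundled this loop with the recurrent step, whereas here it sits on top of generic control-flow primitives.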