RNN roadmap for refactoring #4561

@Superjomn

Description

RNN is a big concept to support in our refactoring.
Given the framework's current capabilities, it is better to implement RNNs in the following stages:

  • recurrent_op, an RNN operator that takes a tensor as input
  • dynamic_recurrent_op, an RNN that takes variable-length sequences as input and produces sequences as output
    • it should replace recurrent_op
  • built-in beam search, which will be a method of dynamic_recurrent_op and make dynamic_recurrent_op an equivalent of the old RecurrentGradientMachine
  • a dynamic RNN based on while_loop and other conditional operators; once this is supported, the infrastructure may also apply to other dynamic models such as Tree-LSTM
  • dynamic beam search, i.e. a beam search built on while_loop and other conditional operators
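The while_loop-based dynamic RNN in the last two stages can be sketched in plain Python. This is only an illustration of the control flow, not the framework's actual API: the `dynamic_rnn` and `step` names are hypothetical, and the Python list stands in for a TensorArray, while the `t < len(sequence)` check plays the role of a `less_than`-style loop condition.

```python
# A minimal, framework-free sketch of a while_loop-style dynamic RNN.
# The real operator would run on tensors inside a compiled program and
# be differentiable; all names here are illustrative.

def dynamic_rnn(sequence, step, init_state):
    """Run `step` over a variable-length `sequence`, collecting outputs."""
    outputs = []                 # stands in for a TensorArray
    state = init_state
    t = 0
    while t < len(sequence):     # the while_loop condition (cf. pd.less_than)
        out, state = step(sequence[t], state)
        outputs.append(out)      # TensorArray write at index t
        t += 1
    return outputs, state

# A toy step function: a running sum plays the role of the hidden state.
def sum_step(x, h):
    h = h + x
    return h, h

outs, final = dynamic_rnn([1, 2, 3], sum_step, 0)
# outs == [1, 3, 6], final == 6
```

Because the loop length comes from the input itself, the same sketch handles sequences of any length without padding, which is the point of replacing the fixed-shape recurrent_op.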

Milestones

  • support the neural machine translation model
    • dynamic_recurrent_op with a built-in beam search module should be ready at that point
  • a text classification model with dynamic RNNs; the required operators, including but not limited to the following, should be ready
    • pd.while_loop
    • pd.equals
    • pd.TensorArray
    • pd.less_than
  • a machine translation model based on dynamic beam search (possibly wrapped as a generator)
    • this requires more of the dynamic operators to be ready
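What the built-in (and later dynamic) beam search would compute can be sketched in plain Python. The `next_scores(prefix)` callback below is a hypothetical stand-in for scoring with the RNN state, and `beam_search` is not the framework's API; it only shows the expand/prune/finish loop that the while_loop-based version would express with conditional operators.

```python
# A minimal beam search sketch over token-level log-probabilities.
# `next_scores(prefix)` returns a dict {token: log_prob}; in the real
# module this would come from the decoder RNN. All names are illustrative.

import math

def beam_search(next_scores, beam_size, max_len, eos):
    beams = [((), 0.0)]                  # (token prefix, total log-prob)
    finished = []
    for _ in range(max_len):
        candidates = []
        for prefix, score in beams:
            for tok, logp in next_scores(prefix).items():
                candidates.append((prefix + (tok,), score + logp))
        # keep only the top-k expansions
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = []
        for prefix, score in candidates[:beam_size]:
            (finished if prefix[-1] == eos else beams).append((prefix, score))
        if not beams:                    # every live beam has ended
            break
    return max(finished + beams, key=lambda c: c[1])

# Toy model: prefers token "a", and emits EOS after two tokens.
def toy_scores(prefix):
    if len(prefix) >= 2:
        return {"</s>": 0.0}
    return {"a": math.log(0.6), "b": math.log(0.4)}

best, score = beam_search(toy_scores, beam_size=2, max_len=5, eos="</s>")
# best == ("a", "a", "</s>")
```

The dynamic version would replace the fixed `for` loop with while_loop plus a termination condition, so generation can stop as soon as all beams reach EOS.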
