
Conversation

@JiayiFeng (Collaborator)

update backward documents

@JiayiFeng JiayiFeng requested a review from dzhwinter August 28, 2017 00:38
@JiayiFeng (Collaborator, Author) commented Aug 28, 2017

Something is wrong in this PR. Closing it.

@JiayiFeng JiayiFeng closed this Aug 28, 2017
@JiayiFeng JiayiFeng reopened this Aug 28, 2017
## Motivation

In neural networks, the backpropagation algorithm follows the chain rule, so we need to compose the fundamental gradient operators/expressions together according to the chain rule. Every forward network needs a backward network to construct the full computation graph; the operator/expression's backward pass is generated with respect to its forward pass.
Contributor:

lineage => graph. I should not introduce a new concept into the design doc.

Collaborator Author:

Done.

|                        | forward operator | backward operator                |
| ---------------------- | ---------------- | -------------------------------- |
| **Operator::inputs_**  | Inputs           | Inputs, Outputs, OutputGradients |
| **Operator::outputs_** | Outputs          | InputGradients                   |
Contributor:

Format error: a stray `-` appears at the left of the table's header row.

Collaborator Author:

Done.


grad_op_builder(fengjiayi)
```cpp
REGISTER_OP(add_two, AddTwoOp, AddTwoOpMaker, add_two_grad, AddTwoGradOp);
```
Contributor:
Please replace this with another operator. Otherwise, the user may confuse it with the generic Add used in backward network construction.

Collaborator Author (Aug 28, 2017):
Done. MulOp is used instead.

@dzhwinter (Contributor) left a review:
LGTM++

@JiayiFeng JiayiFeng merged commit 794c2f2 into PaddlePaddle:develop Aug 28, 2017
@JiayiFeng JiayiFeng deleted the complete_backward_doc branch August 28, 2017 21:20