101 | 101 | sp.sign: paddle.sign, |
102 | 102 | sp.ceiling: paddle.ceil, |
103 | 103 | sp.floor: paddle.floor, |
104 | | - # NOTE: sp.Add and sp.Mul are not included here due to misalignment with sympy
| 104 | + # NOTE: sp.Add and sp.Mul are not included here due to misalignment with paddle
105 | 105 | # and are implemented manually in 'OperatorNode._add_operator_func' and |
106 | 106 | # 'OperatorNode._mul_operator_func' |
107 | 107 | } |
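
The table above only covers operators that translate one-to-one into a paddle call. sympy's Add and Mul cannot be plain table entries because sympy flattens them into n-ary nodes (e.g. `sp.Add(a, b, c)` carries three arguments) while `paddle.add` and `paddle.multiply` are strictly binary, which is presumably why the NOTE points at manual handling in `OperatorNode`. A minimal sketch of such fold-based handling; the function names and signatures are illustrative stand-ins for `OperatorNode._add_operator_func` / `_mul_operator_func`, not the module's actual internals:

```python
import functools

import paddle

# Sketch only: an n-ary sympy node's children have already been
# evaluated to tensors; a left-fold reduces them pairwise, which a
# one-to-one {sympy op: paddle op} table entry cannot express.
def add_operator_func(*child_tensors: paddle.Tensor) -> paddle.Tensor:
    # child_tensors[0] + child_tensors[1] + ... for any arity >= 2
    return functools.reduce(paddle.add, child_tensors)


def mul_operator_func(*child_tensors: paddle.Tensor) -> paddle.Tensor:
    # child_tensors[0] * child_tensors[1] * ... for any arity >= 2
    return functools.reduce(paddle.multiply, child_tensors)


# e.g. for an sp.Add node with three evaluated children t1, t2, t3:
# out = add_operator_func(t1, t2, t3)
```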
@@ -711,15 +711,15 @@ def lambdify( |
711 | 711 | such as 'momentum_x'. Defaults to None. |
712 | 712 | create_graph (bool, optional): Whether to create the gradient graphs of |
713 | 713 | the computing process. When it is True, higher order derivatives can
714 | | - be computed; when it is False, the gradient graphs of the
| 714 | + be computed. When it is False, the gradient graphs of the
715 | 715 | computing process are discarded. Defaults to True.
716 | 716 | retain_graph (Optional[bool]): Whether to retain the forward graph used
717 | 717 | to calculate the gradient. When it is True, the graph is retained so
718 | 718 | that users can run backward twice on the same graph. When it is False,
719 | 719 | the graph is freed. Defaults to None, which means it takes the same
720 | 720 | value as `create_graph`.
721 | 721 | fuse_derivative (bool, optional): Whether to fuse the derivative nodes. |
722 | | - for example, if `expr` is 'Derivative(u, x) + Derivative(u, y)' |
| 722 | + For example, if `expr` is 'Derivative(u, x) + Derivative(u, y)',
723 | 723 | it will compute grad(u, x) + grad(u, y) if fuse_derivative=False,
724 | 724 | or sum(grad(u, [x, y])) if fuse_derivative=True, which is more
725 | 725 | efficient in the backward graph. Defaults to False, as it is experimental so not
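
`create_graph` and `retain_graph` match the keyword arguments of the same names on `paddle.grad`, to which they are presumably forwarded. A small, self-contained sketch of the behavior the docstring describes, with a toy expression standing in for a real model output (variable names are illustrative):

```python
import paddle

x = paddle.randn([8, 1])
x.stop_gradient = False  # mark x as a differentiable input
u = paddle.sin(x) * x    # toy stand-in for a model output u(x)

# create_graph=True records the backward computation itself, so the
# returned gradient is differentiable again -- this is what enables
# the higher order derivatives mentioned in the docstring.
du_dx = paddle.grad(u, x, create_graph=True)[0]
d2u_dx2 = paddle.grad(du_dx, x)[0]  # works because of create_graph=True above

# retain_graph=None falls back to the value of create_graph: with
# create_graph=False the forward graph is freed after one backward
# pass, so a second paddle.grad call on the same graph would fail.
```

The `fuse_derivative` trade-off can be written out the same way: the unfused path runs one backward traversal per Derivative term, while the fused path passes both inputs to a single `paddle.grad` call and sums the per-input gradients, here with `paddle.add_n`. A hedged sketch of the two strategies, not the library's actual fusion code:

```python
import paddle

x = paddle.randn([8, 1])
y = paddle.randn([8, 1])
x.stop_gradient = False
y.stop_gradient = False
u = x * x + x * y  # toy stand-in for u(x, y)

# fuse_derivative=False: one backward traversal per Derivative node.
du_dx = paddle.grad(u, x, create_graph=True)[0]
du_dy = paddle.grad(u, y, create_graph=True)[0]
unfused = du_dx + du_dy

# fuse_derivative=True: a single backward traversal over both inputs,
# i.e. the docstring's sum(grad(u, [x, y])).
fused = paddle.add_n(paddle.grad(u, [x, y], create_graph=True))

# Both paths yield the same tensor; the fused one walks the backward
# graph once instead of once per input.
```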
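Summing the fused gradients with `paddle.add_n` rather than chained `+` is a stylistic choice here; the point is that the number of backward traversals, not the final summation, is what `fuse_derivative` optimizes.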