
Conversation

@apivovarov (Contributor)

I noticed that the ADD, SUB, MUL and DIV operators have a fused_activation_function option.

https://raw.githubusercontent.com/tensorflow/tensorflow/r1.13/tensorflow/lite/schema/schema.fbs

This PR adds code to handle the fused_activation_function option in the _convert_elemwise method.
I did not modify any existing tests for these 4 operators because other operators with a fused_activation_function option (e.g. FULLY_CONNECTED or MAX_POOL_2D) do not have any tests with RELU and RELU6 either.
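
For context, the conversion boils down to applying the activation to the result of the elementwise op. A minimal sketch, not the exact TVM source; the helper name is made up here, and the integer codes come from the TFLite schema (NONE = 0, RELU = 1, RELU6 = 3):

```python
from tvm import relay

def apply_fused_activation(expr, fused_activation_fn):
    """Apply a TFLite fused_activation_function to a Relay expression.

    Sketch only: assumes the raw integer codes from the TFLite schema.
    """
    if fused_activation_fn == 0:        # NONE: pass through unchanged
        return expr
    if fused_activation_fn == 1:        # RELU: max(x, 0)
        return relay.nn.relu(expr)
    if fused_activation_fn == 3:        # RELU6: clip to [0, 6]
        return relay.clip(expr, a_min=0.0, a_max=6.0)
    raise NotImplementedError(
        "fused activation {} is not supported".format(fused_activation_fn))
```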

@apivovarov (Contributor, Author)

@FrozenGene Can you have a look?

@FrozenGene (Member)

Better to add a test case. For example, after ADD, we have RELU.
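
A hedged sketch of the kind of test being asked for, using the TF 1.x converter API (tf.lite.TFLiteConverter.from_session is available in r1.13); the converter typically folds a trailing RELU into the ADD op's fused_activation_function field:

```python
import tensorflow as tf

def make_add_relu_tflite():
    # Build a tiny graph where ADD is immediately followed by RELU,
    # then convert it; the resulting ADD op carries the fused RELU.
    with tf.Graph().as_default():
        a = tf.placeholder(tf.float32, shape=[1, 3], name="a")
        b = tf.placeholder(tf.float32, shape=[1, 3], name="b")
        out = tf.nn.relu(tf.add(a, b))
        with tf.Session() as sess:
            converter = tf.lite.TFLiteConverter.from_session(
                sess, [a, b], [out])
            return converter.convert()  # serialized .tflite flatbuffer
```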

@icemelon icemelon added the status: need update need update based on feedbacks label Jun 14, 2019
@apivovarov (Contributor, Author)

Added tests with RELU and RELU6 for ADD, SUB, MUL, DIV.

@apivovarov (Contributor, Author)

apivovarov commented Jun 15, 2019

I replaced the data[1] test input 0,1,2,3,4,5 with 1,2,3,4,5,6 to remove 0 from the data[1] input.
The zero caused different output results (nan vs 0) for the DIV operator with RELU, probably due to different handling of division by zero:

```
Not equal to tolerance rtol=1e-05, atol=1e-05
x and y nan location mismatch:
tflite (x): [[[[nan  1.  1.]]], [[[ 1.  1.  1.]]]]
tvm (y):    [[[[0. 1. 1.]]], [[[1. 1. 1.]]]]
```
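
For the record, this is standard IEEE-754 behavior and easy to reproduce with numpy (a small illustration, not the test code itself):

```python
import numpy as np

# 0/0 is nan and x/0 is inf under IEEE-754; relu (max with 0)
# propagates nan rather than masking it, so a backend that guards
# division by zero (returning 0) will disagree with one that doesn't.
with np.errstate(divide="ignore", invalid="ignore"):
    print(np.float32(0.0) / np.float32(0.0))            # nan
    print(np.float32(1.0) / np.float32(0.0))            # inf
print(np.maximum(np.float32("nan"), np.float32(0.0)))   # nan
```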

@kevinthesun kevinthesun merged commit 5050ab5 into apache:master Jun 17, 2019
@kevinthesun (Contributor)

Thanks!
