Add flop counter hook for RNN, GRU and LSTM #38


Merged
3 commits merged on Apr 22, 2020

Conversation

BCJuan
Contributor

@BCJuan BCJuan commented Apr 21, 2020

Hello,

First of all: very nice repo and application. I am a user.

I am now working with RNNs, so I thought I could extend the functionality to cover them. I have added a hook and a function for computing the flops. I believe the parameter counts are already handled automatically by your existing computations and functionality.
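For reference only, a minimal sketch of the dominant term in such a count, the gate matrix multiplications; this is an illustration, not the actual hook added in this PR:

```python
# Illustrative sketch only, not the hook from this PR.
# Counts multiply-accumulate ops (MACs) for the gate matmuls of a single
# timestep; elementwise activations and state updates are omitted.
def rnn_macs_per_step(cell_type, input_size, hidden_size):
    # A vanilla RNN cell computes one input/hidden matmul pair,
    # a GRU computes three gates, an LSTM four; each gate multiplies
    # both the input and the previous hidden state by a weight matrix.
    gates = {"rnn": 1, "gru": 3, "lstm": 4}[cell_type]
    return gates * hidden_size * (input_size + hidden_size)

print(rnn_macs_per_step("lstm", 10, 20))  # 4 * 20 * (10 + 20) = 2400
```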

I have checked that it works, but I would be more than glad if you took a look at it.

Thank you very much!

@sovrasov
Owner

Hi! Thanks for this contribution. RNN support has really been missing from this project.

@BCJuan
Contributor Author

BCJuan commented Apr 22, 2020

If anything comes up, just let me know; glad to help. If there is something wrong with the RNN flop computation, please do not hesitate to tell me. I would be glad to know, since I am using it for my own purposes.

Owner

@sovrasov sovrasov left a comment


Now the hook seems to be OK. Let's merge it, and then I'm going to update the readme and resolve the shape inconsistency between RNNs and CNNs (LNH vs NCHW): currently, without the input_constructor argument, the first dimension of the input is forced to 1.
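For context, a hedged sketch of working around this with input_constructor, assuming (as in the ptflops code of that era) that the constructor receives the input resolution tuple and returns a dict of keyword arguments for the model's forward call; the sizes and the "x" keyword are made up for illustration:

```python
import torch

# Hedged sketch: by default ptflops builds an input shaped (1, *input_res),
# which for an (L, N, H) RNN input would pin the sequence length to 1.
# An input_constructor can supply a full-length sequence instead.
def rnn_input_constructor(input_res):
    seq_len, batch = 50, 1             # assumed values for illustration
    features = input_res[0]
    x = torch.zeros(seq_len, batch, features)
    return {"x": x}                    # assumed to be passed as forward(**kwargs)

inp = rnn_input_constructor((128,))
print(tuple(inp["x"].shape))  # (50, 1, 128)
```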

@sovrasov sovrasov merged commit 5ef03d1 into sovrasov:master Apr 22, 2020
@BCJuan
Contributor Author

BCJuan commented Apr 22, 2020

Note that I assumed RNN inputs are laid out as NLH, contrary to the standard PyTorch scheme; feel free to change that if you would like. It is stated in the function's docstring. If you want me to do anything, do not hesitate to ask :)

@sovrasov
Owner

sovrasov commented Apr 22, 2020

It seems to me this code is agnostic to the N-L transposition, so there is nothing to change. Now I have to fix some things common to the whole project, and I need some time to figure that out.
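A quick way to see why the count is transposition-agnostic: the total MAC count scales with the product of the first two input dimensions, which is unchanged when they are swapped. A sketch with illustrative numbers, using a simple gate-matmul approximation of LSTM cost:

```python
# Sketch with illustrative numbers: the total gate-matmul MAC count for an
# LSTM layer scales with seq_len * batch, so it does not matter which of
# the first two input dimensions is batch and which is sequence length.
def total_lstm_macs(dim0, dim1, input_size, hidden_size):
    per_step = 4 * hidden_size * (input_size + hidden_size)  # 4 LSTM gates
    return dim0 * dim1 * per_step

# Swapping the first two dims (NLH vs LNH) leaves the total unchanged.
print(total_lstm_macs(50, 8, 10, 20) == total_lstm_macs(8, 50, 10, 20))  # True
```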
