
Fix removal of nodes ingested by multiple downstream nodes #544


Merged — 1 commit merged into fastmachinelearning:master on May 23, 2022

Conversation

jmduarte (Member) commented May 6, 2022

  • Currently, when a node is removed (such as a linear activation), it is only replaced correctly in the first downstream node that uses it
  • This PR replaces it in all downstream nodes (see the rewiring sketch below)

To do

  • Add a pytest

@nickshey
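
To make the fix concrete, here is a minimal sketch of the intended rewiring, assuming a simple graph where each node stores its input and output tensor names as lists. The `Node` class and the names below are illustrative only, not hls4ml's actual internals:

```python
# Illustrative sketch only (not hls4ml's actual API): when removing a node,
# rewire *every* downstream consumer, not just the first one found.
class Node:
    def __init__(self, name, inputs, outputs):
        self.name = name
        self.inputs = list(inputs)    # names of tensors this node consumes
        self.outputs = list(outputs)  # names of tensors this node produces

def remove_node(graph, node):
    """Remove `node`, reconnecting its input to ALL downstream consumers.

    Note: this reconnects to a single upstream output, which is fine while
    the previous node has one output; see the review comment below about
    nodes with multiple outputs (e.g. Clone).
    """
    prev_output = node.inputs[0]      # what the removed node consumed
    removed_output = node.outputs[0]  # what its consumers were reading
    for other in graph:
        # Before the fix, only the first consumer was rewired; here every
        # matching input reference is replaced.
        other.inputs = [prev_output if i == removed_output else i
                        for i in other.inputs]
    graph.remove(node)

# Example: dense1 -> linear -> {dense2, dense3}
dense1 = Node('dense1', ['x'], ['dense1_out'])
linear = Node('linear', ['dense1_out'], ['linear_out'])
dense2 = Node('dense2', ['linear_out'], ['dense2_out'])
dense3 = Node('dense3', ['linear_out'], ['dense3_out'])
graph = [dense1, linear, dense2, dense3]

remove_node(graph, linear)
assert dense2.inputs == ['dense1_out']
assert dense3.inputs == ['dense1_out']
```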

jmduarte requested a review from vloncar May 6, 2022 04:36
jmduarte added the bug label May 6, 2022
jmduarte self-assigned this May 6, 2022
jmduarte changed the title from "Fix removal nodes that are ingested by multiple downstream nodes" to "Fix removal of nodes ingested by multiple downstream nodes" May 6, 2022
thesps (Contributor) commented May 6, 2022

Can we get a new pytest for this?

This all seems fine when prev_node itself has a single output that is consumed multiple times, but it brings us a bit closer to needing to handle a prev_node that has multiple outputs (I think only Clone has that right now), since we always reconnect to prev_node.outputs[0].

Edit: I pushed the branch to a local pr/544 branch to trigger the pytest CI. The failing tests are unrelated to this PR.
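
A pytest along the lines requested might look like the following sketch. The layer names, model shape, and the hls4ml conversion/config calls used here are assumptions for illustration, not necessarily the test that was ultimately added:

```python
# Hedged sketch of a possible pytest; not necessarily the test added for this PR.
import pytest

tf = pytest.importorskip('tensorflow')
hls4ml = pytest.importorskip('hls4ml')

def test_remove_node_with_multiple_consumers(tmp_path):
    # Build a model where a linear activation (removed by hls4ml's
    # optimizer) feeds two downstream Dense layers.
    inp = tf.keras.Input(shape=(8,), name='inp')
    x = tf.keras.layers.Dense(8, name='dense1')(inp)
    x = tf.keras.layers.Activation('linear', name='lin')(x)
    y1 = tf.keras.layers.Dense(4, name='dense2')(x)
    y2 = tf.keras.layers.Dense(4, name='dense3')(x)
    model = tf.keras.Model(inp, [y1, y2])

    config = hls4ml.utils.config_from_keras_model(model)
    hls_model = hls4ml.converters.convert_from_keras_model(
        model, hls_config=config, output_dir=str(tmp_path))
    hls_model.compile()

    names = [layer.name for layer in hls_model.get_layers()]
    # The linear activation should be removed, and both consumers should
    # survive with valid connections (previously one was lost).
    assert 'lin' not in names
    assert 'dense2' in names and 'dense3' in names
```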

jmduarte linked an issue May 6, 2022 that may be closed by this pull request
thesps merged commit 38ae1ae into fastmachinelearning:master May 23, 2022

Successfully merging this pull request may close these issues.

Lost Layers during compilation