Quick Question About graph_iterator_node.py #369
Closed · RyanGNguyen started this conversation in General · Replies: 2 comments, 1 reply
-
Hey @RyanGNguyen, thanks! We switched to shallow copies because deep copying was eventually copying RLock objects, which caused problems. I will change the comment ;)
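For context, `copy.deepcopy` fails outright when the object graph contains a lock, because `_thread.RLock` instances cannot be pickled or copied. Here is a minimal sketch of the failure mode described above, using a hypothetical `GraphInstance` class standing in for the real graph object:

```python
import copy
import threading

class GraphInstance:
    """Hypothetical stand-in for a graph object that holds a lock."""
    def __init__(self):
        self.lock = threading.RLock()
        self.source = None

g = GraphInstance()

# deepcopy recurses into the instance dict, hits the RLock, and raises
# TypeError: cannot pickle '_thread.RLock' object
try:
    copy.deepcopy(g)
    deepcopy_failed = False
except TypeError:
    deepcopy_failed = True

# A shallow copy succeeds: the clone shares the existing lock but gets
# its own attribute dict, so rebinding .source is safe per iteration.
clone = copy.copy(g)
clone.source = "https://example.com"
```

This is why the switch to `copy.copy` works here: the per-URL loop only rebinds a top-level attribute, so sharing the nested lock is harmless.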
1 reply
-
Please update to the new version.
0 replies
-
Hi, I've been experimenting with this library for a project I'm working on and was looking through graph_iterator_node.py out of curiosity. Specifically, I saw this code block in the definition of `_async_execute(self, state: dict, batchsize: int)`:
```python
# creates a deepcopy of the graph instance for each endpoint
for url in urls:
    instance = copy.copy(graph_instance)
    instance.source = url
```
The comment claims to make a deep copy of the graph instance for each URL, but the documentation for `copy.copy` says it only makes shallow copies, unlike `copy.deepcopy`. I'm posting this here since I haven't run into any explicit issues from it, but I figured it was still worth mentioning.
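To illustrate the discrepancy between the comment and the code, here is a minimal sketch (with a hypothetical `GraphInstance` class standing in for the real graph object): a shallow copy gives each iteration its own top-level object, so rebinding `.source` does not affect the original, but nested mutable attributes are still shared, whereas `copy.deepcopy` would duplicate them too.

```python
import copy

class GraphInstance:
    """Hypothetical stand-in for the real graph object."""
    def __init__(self):
        self.source = None
        self.config = {"depth": 1}  # nested mutable attribute

original = GraphInstance()

# Shallow copy: rebinding an attribute on the copy leaves the original alone...
shallow = copy.copy(original)
shallow.source = "https://example.com"

# ...but nested mutable objects are shared between the two instances.
# A deep copy would duplicate them as well.
deep = copy.deepcopy(original)
```

So the code is safe as long as the loop body only rebinds top-level attributes, but mutating `instance.config` in place would leak across all copies.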