Enhance performance of getting information about keywords with big remote libraries #3362
Sending all information in one go sounds like a good idea, and is definitely possible, but would need someone to implement it. I can help with the design but don't have time for the implementation in the near future. Are you interested? Once this feature was added to the Remote API, it would then need to be added to various remote servers before it actually helps. Alternatively, or additionally, we could add a feature to the Remote library to disable querying keyword information other than names and possibly arguments. That would probably be easier to implement and wouldn't require changes to remote servers. Depending on the context, some of this information may be needed during execution (especially argument names and types), so this wouldn't be as good a solution as getting all information in one request.
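The "all information in one request" idea could look roughly like the sketch below on the server side. The function name `get_library_information` and the dict layout are illustrative assumptions here, not the final Remote API design:

```python
import inspect

def get_library_information(keywords):
    """Collect documentation and argument specs for all keywords in one
    pass, so a remote server can answer with a single XML-RPC response
    instead of one request per keyword.

    `keywords` is assumed to be a mapping of keyword name -> callable.
    """
    info = {}
    for name, func in keywords.items():
        signature = inspect.signature(func)
        info[name] = {
            # e.g. ['name', "greeting='Hello'"] for def f(name, greeting='Hello')
            'args': [str(param) for param in signature.parameters.values()],
            'doc': inspect.getdoc(func) or '',
        }
    return info
```

The whole `info` dict would then be marshalled into one XML-RPC response, replacing the per-keyword `get_keyword_documentation`/`get_keyword_arguments` round-trips.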
Yeah, I'm definitely interested in this! Disabling the querying is what we have done already; for now it seems to be working OK, but it does cut some functionality (as you mentioned). I will fork the project and start developing for now then. According to the xmlrpc client documentation it supports dictionaries, so for now I think we can send the data as a dictionary. But how can we pass the variable to the Remote library? As an argument when starting Robot? Or as an argument when defining the library?
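As a quick sanity check that XML-RPC really can marshal nested dictionaries (so all keyword information could travel in one response), the standard library alone is enough; the payload shape below is made up for illustration:

```python
import xmlrpc.client

# A nested dict of keyword info, as one batched response payload might look.
payload = {
    'Open Browser': {'args': ['url', 'browser=chrome'], 'doc': 'Opens a browser.'},
    'Close Browser': {'args': [], 'doc': 'Closes the browser.'},
}

# Round-trip through the XML-RPC marshalling layer.
encoded = xmlrpc.client.dumps((payload,), methodresponse=True)
decoded_params, _method = xmlrpc.client.loads(encoded)

assert decoded_params[0] == payload  # nested dicts and lists survive intact
```

Strings, lists, and string-keyed dicts all map cleanly onto XML-RPC's `<struct>` and `<array>` types, so no custom serialization would be needed.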
We can easily pass arguments to the Remote library.
I was thinking about this a bit more and realized it would be fine to implement this support only in the remote interface, not in the dynamic API. Robot communicates with the dynamic API at the Python level, and I doubt there are problems even if the library has a lot of keywords. The problem is several orders of magnitude bigger when API calls end up going over HTTP. If we only implement this in the Remote API, the needed work is a lot smaller.
@MyMindWorld, @pekkaklarck I just found your discussion on the performance issue for the big library case.
Let's see if we can still get this into RF 4.0. I was planning to make a release candidate already today, so it's pretty late, but I understand this would be a really important enhancement.
This enhances performance of getting information about keywords with big remote libraries considerably. See issue #3362 for more information.
Thanks to PR #3802 by @JFoederer, this enhancement is now done! I did some cleanup in the commits above and still need to enhance the documentation a bit. Obviously remote servers need to be updated as well before this enhancement really brings benefits.
RoboCon sprints are running today, and we just decided to talk about Python remote servers there at 12:00 UTC. Sprints are organized on Gather and you can access them via https://venue.robocon.io. This is very short notice, but it would be great if you could make it there @JFoederer!
Hello!
We have really big remote libraries (about 50,000 keywords) and we are stuck on performance issues.
While researching this I found that when the client asks for keyword names, the server sends them all at once, but documentation, arguments, etc. are requested one by one. So we have around 200-300k requests, and to run a single case we need to wait around 3 minutes before it actually starts.
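The request count follows directly from the per-keyword queries. A back-of-the-envelope calculation (the exact number of metadata calls per keyword and the per-request latency are assumptions for illustration):

```python
keywords = 50_000
# documentation, arguments, tags, etc.; the exact set varies by RF version
metadata_calls_per_keyword = 5

# one get_keyword_names call plus several metadata calls per keyword
total_requests = 1 + keywords * metadata_calls_per_keyword
print(total_requests)  # 250001, within the reported 200-300k range

# at an assumed ~0.7 ms per local HTTP round-trip, startup alone takes:
startup_seconds = total_requests * 0.0007
print(round(startup_seconds))  # 175, i.e. roughly the reported 3 minutes
```

A single batched request would collapse this to O(1) round-trips regardless of library size, which is why the per-request latency dominates only in the remote case.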
That might seem OK, but in real life, when we need to debug something or run pabot (which simply runs allocated instances of Robot), 3 minutes becomes 5 hours for 100 suites, and our parallel setup becomes useless as well.
Maybe it's possible to request all of this at once and have it sent back the same way?
For now we simply removed these functions from the XmlRpcRemoteClient class in Remote.py, and it now starts in less than 1 second :)
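For reference, the workaround of dropping the per-keyword queries can be sketched as a wrapper like the one below. This is illustrative only (the real change edits `XmlRpcRemoteClient` in `Remote.py` directly); the method names follow the dynamic library API:

```python
class NoMetadataRemoteClient:
    """Wraps a real remote client but answers metadata queries locally,
    avoiding one HTTP round-trip per keyword at startup."""

    def __init__(self, real_client):
        self._client = real_client

    def get_keyword_names(self):
        # Still one real request: the names must come from the server.
        return self._client.get_keyword_names()

    def get_keyword_documentation(self, name):
        return ''          # skip the per-keyword documentation request

    def get_keyword_arguments(self, name):
        return ['*args']   # accept anything; loses argument validation

    def get_keyword_tags(self, name):
        return []          # skip the per-keyword tag request
```

The trade-off is exactly the one mentioned earlier in the thread: argument names and types are no longer known to Robot, so argument validation and some tooling support are lost.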
We also researched the dynamic library API, but it seems like it's not what we need :)