
Enhance performance of getting information about keywords with big remote libraries #3362


Description

@MyMindWorld

Hello!
We have really big remote libraries (about 50,000 keywords) and we are stuck on performance issues.

While researching this, I found that when the client asks for keyword names, the server sends them all at once, but documentation, arguments, etc. are fetched one keyword at a time. So we end up with around 200-300k requests, and to run a single test case we have to wait around 3 minutes before execution actually starts.
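A minimal sketch of the handshake described above, using a plain XML-RPC client against the default remote server port; this is simplified and illustrative, not the exact `Remote.py` code:

```python
import xmlrpc.client

server = xmlrpc.client.ServerProxy("http://127.0.0.1:8270")

# One request returns every keyword name at once...
names = server.get_keyword_names()  # ~50,000 names in our case

# ...but metadata is then fetched with a separate request per keyword,
# so ~50,000 keywords turn into 200-300k round trips.
for name in names:
    args = server.get_keyword_arguments(name)
    doc = server.get_keyword_documentation(name)
```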

That might seem acceptable, but in real life, when we need to debug something or run pabot (which simply runs a separate Robot instance per chunk of work), 3 minutes becomes 5 hours for 100 suites, and our parallel setup becomes useless as well.

Maybe it is possible to request all of this information in a single call and send it back the same way?
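One possible shape for such a batched call, as a server-side method; the name `get_library_information` is hypothetical here, not an existing part of the remote protocol, and the structure of the returned dict is just an assumption:

```python
def get_library_information(self):
    # Hypothetical batched endpoint: build one dict with all keyword
    # metadata so the client needs a single request instead of 2-3
    # requests per keyword.
    return {
        name: {
            'args': self.get_keyword_arguments(name),
            'doc': self.get_keyword_documentation(name),
        }
        for name in self.get_keyword_names()
    }
```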

For now we have simply removed these functions from the XmlRpcRemoteClient class in Remote.py, and execution now starts in less than 1 second.
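Roughly what that workaround amounts to, expressed as a monkey patch instead of editing Remote.py directly (illustrative sketch; the fallback values shown are assumptions):

```python
from robot.libraries.Remote import XmlRpcRemoteClient

# Stub out the per-keyword metadata calls on the client so no round
# trips are made for them; arguments fall back to a catch-all spec
# and documentation to an empty string.
XmlRpcRemoteClient.get_keyword_arguments = lambda self, name: ['*args']
XmlRpcRemoteClient.get_keyword_documentation = lambda self, name: ''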

We also looked into the dynamic library API, but it does not seem to be what we need.
