Cache _get_package_version to avoid heavy CPU load #817
Overview
The behavior fixed in this PR was the reason our service ran out of CPU after the NR update.

If you have redis-py installed but don't have aioredis, New Relic still tries to find a version of aioredis every time you call your Redis client. Instead of a quick check based on importing the library, NR now uses pkg_resources to look up the package, and pkg_resources is slow :(
Because we use Redis heavily, this significantly affected performance.
The easiest solution is to add a global cache for package versions; they won't change at runtime anyway.
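A minimal sketch of that caching approach: `functools.lru_cache` memoizes the lookup so the expensive call runs at most once per package name. The function name and use of stdlib `importlib.metadata` here are illustrative, not the agent's actual implementation (which uses pkg_resources).

```python
import functools
from importlib import metadata


@functools.lru_cache(maxsize=None)
def get_package_version(name):
    # The expensive lookup runs only on the first call per package;
    # later calls return the cached result. Package versions don't
    # change at runtime, so the cache never needs invalidation.
    try:
        return metadata.version(name)
    except metadata.PackageNotFoundError:
        # Package not installed (e.g. aioredis absent while redis-py
        # is present); cache the negative result too.
        return None
```

Caching the `None` result matters here: the original hot path was repeatedly re-probing for a library that was never installed.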
Related Github Issue
Include a link to the related GitHub issue, if applicable
Testing
The agent includes a suite of tests which should be used to
verify your changes don't break existing functionality. These tests will run with
GitHub Actions when a pull request is made. More details on running the tests locally can be found
here.
For most contributions it is strongly recommended to add additional tests which
exercise your changes.