Description
(This issue is based on this comment: #1356 (comment))
New versions of our own code are kept up-to-date on the prod and staging "automation" servers by our old friend https://github.com/cmu-delphi/github-deploy-repo (🧄), which does not appear to have any support for maintaining requirements/dependencies (much less their versions). As such, installing and versioning dependencies on those machines seems to be done only purposefully and manually, by someone with the proper insight and permissions/credentials.
We should try to keep our production and development environments in sync so that we don't run into inconsistent and confusing situations caused by differing versions. We have talked about this a number of times before, for various reasons, but I don't think we ever came to a consensus. Some possibilities for addressing it include:
- Run our production automation jobs out of our prebuilt docker containers instead of on "bare hardware". They would then have the same versions of requirements/dependencies as in dev. This might be "the right way to do things", but it sounds complicated and is probably a longer-term project. It is worth noting that our API/web servers already run in containers.
- Add a `pip` step to the end of any github-deploy-repo runs (see the sketch after this list). This sounds easy(ish?) to do, though there could be unforeseen consequences. It is also a temporary band-aid, since g-d-r is already in the crosshairs in #1042 (Move acquisition deployment off of github-deploy-repo).
- Maybe get ansible to do it?
Other ideas and suggestions are welcome!