cleanup: remove adapter commons #25045
Conversation
We no longer need this abstraction after deleting prompt adapters.
Code Review
This pull request is a solid refactoring that simplifies the LoRA management logic by removing the generic `adapter_commons` helper layer and inlining its functionality directly into `LoRAModelManager` and `WorkerLoRAManager`. This change significantly improves code clarity and maintainability by removing a layer of abstraction that was specific to LoRA anyway. The logic from the helper functions has been correctly preserved in the new inlined implementations, and the code is now more straightforward and easier to follow. I've reviewed the changes and found no issues; this is a great improvement.
Codex Review: Here are some suggestions. Reply with `@codex fix comments` to fix any unresolved comments.
```diff
 def set_active_adapters(self, requests: set[Any],
                         mapping: Optional[Any]) -> None:
-    set_active_adapters_worker(requests, mapping, self._apply_adapters,
-                               self._adapter_manager.set_adapter_mapping)
+    self._apply_adapters(requests)
+    if mapping is not None:
+        self._adapter_manager.set_adapter_mapping(mapping)
```
[P1] Failing to clear adapter mapping when disabled
The new worker manager now calls `_adapter_manager.set_adapter_mapping` only when `mapping` is not `None`. Previously the helper always invoked the setter, even for `None`, which reset any stale mapping in the model. With the current logic, calling `set_active_adapters(requests, None)` leaves whatever mapping was previously set, so a request that intends to disable or clear LoRA mappings will continue to use the old mapping and potentially apply outdated adapters. This can cause incorrect adapter application across requests or after removing adapters. Consider always delegating the call, including when `mapping` is `None`, so the manager can reset its internal state.
@codex fix comments
Signed-off-by: Jee Jee Li <[email protected]>
Thank you for the cleanup, tested on my local device:
```
===================================================================== test session starts =====================================================================
platform linux -- Python 3.10.15, pytest-8.3.3, pluggy-1.5.0
rootdir: /root/Code/vllm_main/vllm
configfile: pyproject.toml
plugins: shard-0.1.2, rerunfailures-14.0, mock-3.14.0, buildkite-test-collector-0.1.9, forked-1.6.0, anyio-4.6.2.post1, asyncio-0.24.0
asyncio: mode=strict, default_loop_scope=None
collected 15 items
Running 15 items in this shard
test_lora_manager.py ............... [100%]
```
Signed-off-by: Jee Jee Li <[email protected]> Co-authored-by: Jee Jee Li <[email protected]>
Signed-off-by: Jee Jee Li <[email protected]> Co-authored-by: Jee Jee Li <[email protected]> Signed-off-by: charlifu <[email protected]>
Signed-off-by: Jee Jee Li <[email protected]> Co-authored-by: Jee Jee Li <[email protected]> Signed-off-by: xuebwang-amd <[email protected]>
Signed-off-by: Jee Jee Li <[email protected]> Co-authored-by: Jee Jee Li <[email protected]>
https://chatgpt.com/codex/tasks/task_e_68ca49ea72a08329a069e9249f17eb78