Commit 09ac6b9

committed
fix: load bias from config
Signed-off-by: Mohd Muzzammil <[email protected]>
1 parent dd16bdc commit 09ac6b9

File tree: 1 file changed, +1 −1 lines changed


vllm/model_executor/models/llama_eagle.py

Lines changed: 1 addition & 1 deletion
```diff
@@ -71,7 +71,7 @@ def __init__(
         ])
         self.fc = torch.nn.Linear(self.config.hidden_size * 2,
                                   self.config.hidden_size,
-                                  bias=False)
+                                  bias=getattr(self.config, "bias", False))

     def forward(
         self,
```
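The patched line uses `getattr` with a default so that model configs which do not define a `bias` attribute keep the previous behavior (`bias=False`), while configs that do define it are respected. A minimal sketch of that fallback pattern, using a hypothetical `SimpleNamespace` stand-in for the model config (the real object in vLLM is a transformers config):

```python
from types import SimpleNamespace

# Hypothetical configs illustrating the getattr fallback from the diff:
# one defines `bias`, the other omits it entirely.
cfg_with_bias = SimpleNamespace(hidden_size=64, bias=True)
cfg_without_bias = SimpleNamespace(hidden_size=64)

def fc_bias(config):
    # Same pattern as the patched line: use the config's `bias`
    # attribute if present, otherwise fall back to False.
    return getattr(config, "bias", False)

print(fc_bias(cfg_with_bias))     # True
print(fc_bias(cfg_without_bias))  # False
```

The advantage over `config.bias` is that older or third-party checkpoints whose configs predate the `bias` field do not raise `AttributeError`; they silently get the old default.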
