Conversation

bythew3i (Contributor):
Tested:

python test/test_pallas.py -v -k PallasTest.test_ragged_paged_attention_wrapper_with_dynamo

New factors taken into account for auto-tuning:

  • q_dtype_name
  • kv_dtype_name
  • num_q_heads_per_blk
  • num_kv_heads_per_blk
  • head_dim
  • page_size
  • max_num_batched_tokens
  • max_model_len = page_size * pages_per_seq
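
The factors above can be thought of as a cache/lookup key for the auto-tuner: kernels tuned for one combination are reused when the same combination recurs. A minimal sketch of such a key, assuming a hypothetical `TuningKey` structure (illustrative only, not the actual pytorch/xla implementation):

```python
from typing import NamedTuple

class TuningKey(NamedTuple):
    """Hypothetical key over the auto-tuning factors listed above."""
    q_dtype_name: str
    kv_dtype_name: str
    num_q_heads_per_blk: int
    num_kv_heads_per_blk: int
    head_dim: int
    page_size: int
    max_num_batched_tokens: int
    max_model_len: int  # derived: page_size * pages_per_seq

def make_tuning_key(q_dtype_name, kv_dtype_name, num_q_heads_per_blk,
                    num_kv_heads_per_blk, head_dim, page_size,
                    max_num_batched_tokens, pages_per_seq):
    # max_model_len is derived from page_size and pages_per_seq,
    # matching the relation stated in the PR description.
    return TuningKey(q_dtype_name, kv_dtype_name, num_q_heads_per_blk,
                     num_kv_heads_per_blk, head_dim, page_size,
                     max_num_batched_tokens, page_size * pages_per_seq)

key = make_tuning_key("bfloat16", "bfloat16", 8, 1, 128, 16, 512, 128)
print(key.max_model_len)  # 16 * 128 = 2048
```

Because the key is a hashable tuple, it can index a dict of pre-tuned block configurations, so re-tuning is skipped for already-seen shape/dtype combinations.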

yaochengji (Collaborator) left a review:

LGTM, thanks for your great contribution!

yaochengji merged commit c4b45a9 into pytorch:master on Apr 28, 2025.
45 of 46 checks passed