[Docs] Fix warnings in mkdocs build
#23649
Conversation
WARNING - griffe: vllm/model_executor/layers/lightning_attn.py:461: No type or annotation for parameter 'q'
WARNING - griffe: vllm/model_executor/layers/lightning_attn.py:462: No type or annotation for parameter 'k'
WARNING - griffe: vllm/model_executor/layers/lightning_attn.py:463: No type or annotation for parameter 'v'
WARNING - griffe: vllm/model_executor/layers/lightning_attn.py:464: No type or annotation for parameter 'ed'
WARNING - griffe: vllm/model_executor/layers/lightning_attn.py:465: No type or annotation for parameter 'block_size'
WARNING - griffe: vllm/model_executor/layers/lightning_attn.py:466: No type or annotation for parameter 'kv_history'
WARNING - griffe: vllm/model_executor/layers/lightning_attn.py:469: No type or annotation for returned value 'output'
WARNING - griffe: vllm/model_executor/layers/lightning_attn.py:470: No type or annotation for returned value 'kv'

Signed-off-by: Zerohertz <[email protected]>
WARNING - griffe: vllm/model_executor/layers/linear.py:236: No type or annotation for parameter 'bias'
WARNING - griffe: vllm/model_executor/layers/linear.py:236: Parameter 'bias' does not appear in the function signature
WARNING - griffe: vllm/model_executor/layers/linear.py:381: No type or annotation for parameter 'output_size'
WARNING - griffe: vllm/model_executor/layers/linear.py:381: Parameter 'output_size' does not appear in the function signature

Signed-off-by: Zerohertz <[email protected]>
e313771 to 4e192ab
Thanks for this PR! Don't feel you have to do this all in one go. You'll likely run into lots of merge conflicts as `main` changes if you do.
Great, thank you! I will work on this and aim to finalize the PR by tomorrow. I also have a quick question. To achieve a faster … Sharing that information would be a great help to my work.
Co-authored-by: Harry Mellor <[email protected]> Signed-off-by: Hyogeun Oh (오효근) <[email protected]>
WARNING - griffe: vllm/attention/backends/differential_flash_attn.py:808: No type or annotation for parameter 'query'
WARNING - griffe: vllm/attention/backends/differential_flash_attn.py:808: Parameter 'query' does not appear in the function signature
WARNING - griffe: vllm/attention/backends/differential_flash_attn.py:809: No type or annotation for parameter 'key'
WARNING - griffe: vllm/attention/backends/differential_flash_attn.py:809: Parameter 'key' does not appear in the function signature
WARNING - griffe: vllm/attention/backends/differential_flash_attn.py:810: No type or annotation for parameter 'value'
WARNING - griffe: vllm/attention/backends/differential_flash_attn.py:810: Parameter 'value' does not appear in the function signature
WARNING - griffe: vllm/attention/backends/differential_flash_attn.py:812: Failed to get 'name: description' pair from 'kv_cache = [2, num_blocks, block_size, num_kv_heads, head_size]'

Signed-off-by: Zerohertz <[email protected]>
WARNING - griffe: vllm/engine/async_llm_engine.py:482: No type or annotation for parameter '*args'
WARNING - griffe: vllm/engine/async_llm_engine.py:483: No type or annotation for parameter '**kwargs'

Signed-off-by: Zerohertz <[email protected]>
WARNING - griffe: vllm/attention/backends/xformers.py:647: No type or annotation for parameter 'output'
WARNING - griffe: vllm/attention/backends/xformers.py:647: Parameter 'output' does not appear in the function signature
WARNING - griffe: vllm/attention/backends/xformers.py:477: Failed to get 'name: description' pair from 'kv_cache = [2, num_blocks, block_size * num_kv_heads * head_size]'
WARNING - griffe: vllm/attention/backends/xformers.py:481: No type or annotation for parameter 'attn_type'
WARNING - griffe: vllm/attention/backends/xformers.py:481: Parameter 'attn_type' does not appear in the function signature

Signed-off-by: Zerohertz <[email protected]>
You might be able to play with the `exclude` list (lines 75 to 81 in fdeb3da).
The docs for the API generation package we use are at https://github.com/tlambert03/mkdocs-api-autonav. The docs for the package that then renders the generated API docs are at https://mkdocstrings.github.io/python/usage/ (this one might be less helpful for reducing the build time, but more information can't hurt). For bonus points, see if anything could implement the best practices from https://mkdocstrings.github.io/griffe/guide/users/recommendations/python-code (the package that …).
WARNING - griffe: vllm/attention/backends/utils.py:563: Failed to get 'exception: description' pair from 'is `None` when required for the calculations.'

Signed-off-by: Zerohertz <[email protected]>
WARNING - griffe: vllm/entrypoints/llm.py:150: No type or annotation for parameter '**kwargs'

Signed-off-by: Zerohertz <[email protected]>
Signed-off-by: Zerohertz <[email protected]>
WARNING - griffe: vllm/attention/backends/flash_attn.py:608: Failed to get 'name: description' pair from 'kv_cache = [2, num_blocks, block_size, num_kv_heads, head_size]'
WARNING - griffe: vllm/attention/backends/flash_attn.py:864: No type or annotation for parameter 'attn_metadata'

Signed-off-by: Zerohertz <[email protected]>
WARNING - griffe: vllm/attention/backends/rocm_flash_attn.py:590: Failed to get 'name: description' pair from 'kv_cache = [2, num_blocks, block_size * num_kv_heads * head_size]'
WARNING - griffe: vllm/attention/backends/rocm_flash_attn.py:594: No type or annotation for parameter 'attn_type'
WARNING - griffe: vllm/attention/backends/rocm_flash_attn.py:594: Parameter 'attn_type' does not appear in the function signature

Signed-off-by: Zerohertz <[email protected]>
Signed-off-by: Zerohertz <[email protected]>
      logits_processors: Optional[list[Union[str,
                                              type[LogitsProcessor]]]] = None,
-     **kwargs,
+     **kwargs: Any,
I initially used `object` for the `**kwargs` type hint, as I had seen it used in most other code. However, this caused issues with mypy, so I changed it to `Any`.
It looks like you did the right thing. According to https://peps.python.org/pep-0484/#arbitrary-argument-lists-and-default-argument-values you should type hint the type of the expected values of the contents of `**kwargs`. Since we use `**kwargs` to pass arbitrary keyword arguments, `Any` is an appropriate choice.
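For reference, PEP 484 specifies that the annotation on `**kwargs` describes the type of each *value* it receives, not the mapping itself. A small sketch (the function name is made up for illustration):

```python
from typing import Any


def build_options(name: str, **kwargs: Any) -> dict[str, Any]:
    """Collect arbitrary keyword arguments into an options dict.

    Per PEP 484, `**kwargs: Any` means each value passed by keyword
    may be of any type; inside the function, `kwargs` itself is a
    `dict[str, Any]`.
    """
    return {"name": name, **kwargs}
```

Had the values all shared one type, e.g. `**kwargs: str`, mypy would then reject non-string keyword values at call sites; `Any` is the right choice when the arguments are genuinely arbitrary.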
WARNING - griffe: vllm/sequence.py:150: No type or annotation for parameter 'prompt_token_ids'
WARNING - griffe: vllm/sequence.py:150: Parameter 'prompt_token_ids' does not appear in the function signature
WARNING - griffe: vllm/sequence.py:151: No type or annotation for parameter 'output_token_ids'
WARNING - griffe: vllm/sequence.py:151: Parameter 'output_token_ids' does not appear in the function signature
WARNING - griffe: vllm/sequence.py:941: No type or annotation for parameter 'request_id'
WARNING - griffe: vllm/sequence.py:941: Parameter 'request_id' does not appear in the function signature
WARNING - griffe: vllm/sequence.py:942: No type or annotation for parameter 'is_prompt'
WARNING - griffe: vllm/sequence.py:942: Parameter 'is_prompt' does not appear in the function signature
WARNING - griffe: vllm/sequence.py:943: No type or annotation for parameter 'seq_data'
WARNING - griffe: vllm/sequence.py:943: Parameter 'seq_data' does not appear in the function signature
WARNING - griffe: vllm/sequence.py:944: No type or annotation for parameter 'sampling_params'
WARNING - griffe: vllm/sequence.py:944: Parameter 'sampling_params' does not appear in the function signature
WARNING - griffe: vllm/sequence.py:945: No type or annotation for parameter 'block_tables'
WARNING - griffe: vllm/sequence.py:945: Parameter 'block_tables' does not appear in the function signature
WARNING - griffe: vllm/sequence.py:947: No type or annotation for parameter 'do_sample'
WARNING - griffe: vllm/sequence.py:947: Parameter 'do_sample' does not appear in the function signature
WARNING - griffe: vllm/sequence.py:950: No type or annotation for parameter 'token_chunk_size'
WARNING - griffe: vllm/sequence.py:950: Parameter 'token_chunk_size' does not appear in the function signature
WARNING - griffe: vllm/sequence.py:952: No type or annotation for parameter 'lora_request'
WARNING - griffe: vllm/sequence.py:952: Parameter 'lora_request' does not appear in the function signature
WARNING - griffe: vllm/sequence.py:953: No type or annotation for parameter 'computed_block_nums'
WARNING - griffe: vllm/sequence.py:953: Parameter 'computed_block_nums' does not appear in the function signature
WARNING - griffe: vllm/sequence.py:955: No type or annotation for parameter 'state'
WARNING - griffe: vllm/sequence.py:955: Parameter 'state' does not appear in the function signature
WARNING - griffe: vllm/sequence.py:956: No type or annotation for parameter 'multi_modal_data'
WARNING - griffe: vllm/sequence.py:956: Parameter 'multi_modal_data' does not appear in the function signature
WARNING - griffe: vllm/sequence.py:957: No type or annotation for parameter 'mm_processor_kwargs'
WARNING - griffe: vllm/sequence.py:957: Parameter 'mm_processor_kwargs' does not appear in the function signature
WARNING - griffe: vllm/sequence.py:958: No type or annotation for parameter 'encoder_seq_data'
WARNING - griffe: vllm/sequence.py:958: Parameter 'encoder_seq_data' does not appear in the function signature
WARNING - griffe: vllm/sequence.py:962: No type or annotation for parameter 'cross_block_table'
WARNING - griffe: vllm/sequence.py:962: Parameter 'cross_block_table' does not appear in the function signature
WARNING - griffe: vllm/sequence.py:1044: No type or annotation for parameter 'parent_seq_id'
WARNING - griffe: vllm/sequence.py:1044: Parameter 'parent_seq_id' does not appear in the function signature
WARNING - griffe: vllm/sequence.py:1046: No type or annotation for parameter 'output_token'
WARNING - griffe: vllm/sequence.py:1046: Parameter 'output_token' does not appear in the function signature
WARNING - griffe: vllm/sequence.py:1047: No type or annotation for parameter 'logprobs'
WARNING - griffe: vllm/sequence.py:1047: Parameter 'logprobs' does not appear in the function signature

Signed-off-by: Zerohertz <[email protected]>
WARNING - griffe: vllm/entrypoints/openai/tool_parsers/minimax_tool_parser.py:469: No type or annotation for parameter 'args_match'

Signed-off-by: Zerohertz <[email protected]>
WARNING - griffe: vllm/outputs.py:450: Failed to get 'name: description' pair from 'Its length depends on the number of classes.'
WARNING - griffe: vllm/outputs.py:412: Failed to get 'name: description' pair from 'Its length depends on the hidden dimension of the model.'

Signed-off-by: Zerohertz <[email protected]>
WARNING - mkdocs_autorefs: api/vllm/index.md: from vllm/vllm/entrypoints/llm.py:696: (vllm.LLM.chat) Could not find cross-reference target 'generate'
WARNING - mkdocs_autorefs: api/vllm/index.md: from vllm/vllm/entrypoints/llm.py:1336: (vllm.LLM.wake_up) Could not find cross-reference target 'sleep'
WARNING - mkdocs_autorefs: api/vllm/index.md: from vllm/vllm/engine/llm_engine.py:623: (vllm.LLMEngine.add_request) Could not find cross-reference target 'vllm.Sequence'
WARNING - mkdocs_autorefs: api/vllm/index.md: from vllm/vllm/engine/llm_engine.py:623: (vllm.LLMEngine.add_request) Could not find cross-reference target 'vllm.SequenceGroup'
WARNING - mkdocs_autorefs: api/vllm/index.md: from vllm/vllm/engine/llm_engine.py:623: (vllm.LLMEngine.add_request) Could not find cross-reference target 'vllm.Sequence'
WARNING - mkdocs_autorefs: api/vllm/index.md: from vllm/vllm/engine/llm_engine.py:623: (vllm.LLMEngine.add_request) Could not find cross-reference target 'vllm.SequenceGroup'
WARNING - mkdocs_autorefs: api/vllm/engine/llm_engine.md: from vllm/vllm/engine/llm_engine.py:623: (vllm.engine.llm_engine.LLMEngine.add_request) Could not find cross-reference target 'vllm.Sequence'
WARNING - mkdocs_autorefs: api/vllm/engine/llm_engine.md: from vllm/vllm/engine/llm_engine.py:623: (vllm.engine.llm_engine.LLMEngine.add_request) Could not find cross-reference target 'vllm.SequenceGroup'
WARNING - mkdocs_autorefs: api/vllm/engine/llm_engine.md: from vllm/vllm/engine/llm_engine.py:623: (vllm.engine.llm_engine.LLMEngine.add_request) Could not find cross-reference target 'vllm.Sequence'
WARNING - mkdocs_autorefs: api/vllm/engine/llm_engine.md: from vllm/vllm/engine/llm_engine.py:623: (vllm.engine.llm_engine.LLMEngine.add_request) Could not find cross-reference target 'vllm.SequenceGroup'
WARNING - mkdocs_autorefs: api/vllm/entrypoints/llm.md: from vllm/vllm/entrypoints/llm.py:696: (vllm.entrypoints.llm.LLM.chat) Could not find cross-reference target 'generate'
WARNING - mkdocs_autorefs: api/vllm/entrypoints/llm.md: from vllm/vllm/entrypoints/llm.py:1336: (vllm.entrypoints.llm.LLM.wake_up) Could not find cross-reference target 'sleep'

Signed-off-by: Zerohertz <[email protected]>
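These autorefs warnings are generally fixed by using a fully qualified cross-reference target in the docstring instead of a bare name, so the link resolves regardless of which page renders it. A hypothetical sketch of the pattern (not the actual vLLM code):

```python
class LLM:
    def sleep(self) -> None:
        """Put the engine into sleep mode."""

    def wake_up(self) -> None:
        """Wake the engine up from sleep mode.

        A bare reference like `[sleep][]` may fail to resolve; a fully
        qualified target such as `[sleep][vllm.LLM.sleep]` gives
        mkdocs-autorefs an unambiguous anchor to link to.
        """
```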
Some more info to keep in the thread. As documented in https://mkdocstrings.github.io/python/usage/docstrings/google/#docstring, any section that isn't Google style will be transformed into a MkDocs admonition.
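Concretely, recognized Google-style sections (`Args:`, `Returns:`, `Raises:`, …) are rendered as structured parameter tables, while any other titled section falls through to an admonition. A small illustrative sketch (hypothetical function):

```python
def scale(x: float, factor: float = 2.0) -> float:
    """Scale a value by a constant factor.

    Args:
        x: The value to scale.
        factor: Multiplier applied to `x`.

    Returns:
        The scaled value.

    Note:
        "Note" is not a Google-style section that mkdocstrings parses
        into a table, so this block is rendered as a MkDocs admonition.
    """
    return x * factor
```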
WARNING - griffe: vllm/core/block_manager.py:476: Parameter 'sequence_group' does not appear in the function signature
WARNING - griffe: vllm/core/block_manager.py:355: Parameter 'sequence_group' does not appear in the function signature
WARNING - griffe: vllm/core/block_manager.py:408: Parameter 'num_lookahead_slots' does not appear in the function signature
WARNING - griffe: vllm/core/block_manager.py:423: Parameter 'sequence_group' does not appear in the function signature

Signed-off-by: Zerohertz <[email protected]>
Tip:
  - api-autonav:
      modules: ["vllm"]
      api_root_uri: "api"
      exclude:
        - "re:vllm\\._.*"
        - "vllm.third_party"
        - "vllm.vllm_flash_attn"
        - "re:vllm\\.(?!core).*" # This can decrease build time (but may cause auto-ref issues, which you can ignore)
As you mentioned, I'm also concerned about potential conflicts, so I'll remove the "WIP" and keep the PR open.
mkdocs build
Signed-off-by: Harry Mellor <[email protected]>
Signed-off-by: Harry Mellor <[email protected]>
    prompt_token_ids: The token IDs of the prompt.
    output_token_ids: The token IDs of the output.
    cumulative_logprob: The cumulative log probability of the output.
I moved these to their respective `@property` methods because this class seems to have no public attributes.
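When a class exposes its data only through properties, moving each attribute description from the class docstring onto the corresponding property docstring keeps the documentation attached to something griffe can actually find. A hypothetical sketch of the pattern:

```python
class SequenceOutput:
    """Hypothetical output container with no public attributes."""

    def __init__(
        self,
        prompt_token_ids: list[int],
        output_token_ids: list[int],
    ) -> None:
        self._prompt_token_ids = prompt_token_ids
        self._output_token_ids = output_token_ids

    @property
    def prompt_token_ids(self) -> list[int]:
        """The token IDs of the prompt."""
        return self._prompt_token_ids

    @property
    def output_token_ids(self) -> list[int]:
        """The token IDs of the output."""
        return self._output_token_ids
```

Each property's docstring is rendered alongside its annotated return type, so no `Attributes:` section in the class docstring is needed.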
Signed-off-by: Zerohertz <[email protected]> Signed-off-by: Hyogeun Oh (오효근) <[email protected]> Signed-off-by: Harry Mellor <[email protected]> Co-authored-by: Harry Mellor <[email protected]> Signed-off-by: tc-mb <[email protected]>
Signed-off-by: Zerohertz <[email protected]> Signed-off-by: Hyogeun Oh (오효근) <[email protected]> Signed-off-by: Harry Mellor <[email protected]> Co-authored-by: Harry Mellor <[email protected]>
Signed-off-by: Zerohertz <[email protected]> Signed-off-by: Hyogeun Oh (오효근) <[email protected]> Signed-off-by: Harry Mellor <[email protected]> Co-authored-by: Harry Mellor <[email protected]> Signed-off-by: Xiao Yu <[email protected]>
Signed-off-by: Zerohertz <[email protected]> Signed-off-by: Hyogeun Oh (오효근) <[email protected]> Signed-off-by: Harry Mellor <[email protected]> Co-authored-by: Harry Mellor <[email protected]> Signed-off-by: Matthew Bonanni <[email protected]>
Signed-off-by: Zerohertz <[email protected]> Signed-off-by: Hyogeun Oh (오효근) <[email protected]> Signed-off-by: Harry Mellor <[email protected]> Co-authored-by: Harry Mellor <[email protected]> Signed-off-by: Shiyan Deng <[email protected]>
As discussed with @hmellor on Slack, this PR addresses the warnings generated when running `mkdocs build`. (#22588)
Please let me know if you have any feedback!
Note
Warnings fixed: