Fix max_tokens handling in vllm_vlms.py (#2637) #4098

Triggered via push: January 21, 2025 16:55
Status: Success
Total duration: 12s
Workflow: new_tasks.yml
Event: push
Job: Scan for changed tasks (3s)

Annotations

1 warning

Scan for changed tasks: ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636