
[python] Use vllm chat object #5662

Triggered via pull request January 14, 2025 23:16
@xyang16
synchronize #2659
xyang16:chat
Status: Failure
Total duration: 11m 19s
Artifacts: 2

continuous.yml

on: pull_request
Matrix: build

Annotations

1 error and 3 warnings
Error: build (ubuntu-latest)
  Process completed with exit code 1.

Warning: build (ubuntu-latest)
  ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636

Warning: build (ubuntu-latest)
  No files were found with the provided path: awscurl/build/reports benchmark/build/reports serving/build/reports wlm/build/reports engines/python/build/reports. No artifacts will be uploaded.

Warning: build (windows-latest)
  Failed to save: "C:\Program Files\Git\usr\bin\tar.exe" failed with error: The process 'C:\Program Files\Git\usr\bin\tar.exe' failed with exit code 2
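
The "No files were found" warning above is emitted by the actions/upload-artifact action when none of the configured paths match (its default `if-no-files-found: warn` behavior). A minimal sketch of what the report-upload step in continuous.yml likely looks like; the step name, artifact name, and action version here are assumptions, not the actual workflow contents:

```yaml
# Hypothetical sketch of the report-upload step in continuous.yml.
# The paths are taken from the warning text; everything else is assumed.
- name: Upload test reports
  uses: actions/upload-artifact@v4
  if: always()              # upload reports even when the build step fails
  with:
    name: reports
    path: |
      awscurl/build/reports
      benchmark/build/reports
      serving/build/reports
      wlm/build/reports
      engines/python/build/reports
    # 'warn' is the default and produces the annotation seen in this run;
    # 'ignore' would silence it, 'error' would fail the job instead.
    if-no-files-found: warn
```

With exit code 1 in the build step, the report directories were likely never generated, which is why no paths matched.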

Artifacts

Produced during runtime
Name                     Size
serving-macos-latest     1.18 MB
serving-windows-latest   1.18 MB