vLLM: Move from guided_options_request to structured_outputs by suryabdev · Pull Request #1805 · huggingface/smolagents (original) (raw)
This PR fixes #1794.
With vLLM > 0.10.1, `guided_options_request` is deprecated. This moves to the recommended `structured_outputs` approach:
https://github.com/vllm-project/vllm/blob/main/docs/features/structured_outputs.md#structured-outputs
> **Warning:** If you are still using the following deprecated API fields, please update your code to use `structured_outputs` as demonstrated in the rest of this document:
- `guided_json` -> `{"structured_outputs": {"json": ...}}` or `StructuredOutputsParams(json=...)`
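Per the mapping above, the migration is essentially a matter of nesting the guided-decoding field under a `structured_outputs` key. A minimal sketch of the before/after request shape, using plain dicts and a made-up schema (no vLLM import; names taken from the quoted docs):

```python
# Illustrative JSON schema (hypothetical example, not from the PR).
schema = {
    "type": "object",
    "properties": {"answer": {"type": "string"}},
    "required": ["answer"],
}

# Deprecated style: the guided-decoding field sits at the top level.
old_style = {"guided_json": schema}

# New style per the vLLM docs: nested under "structured_outputs".
new_style = {"structured_outputs": {"json": schema}}
```

Equivalently, the docs mention `StructuredOutputsParams(json=...)` as the typed counterpart of the nested dict.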
Based on my understanding of the changelog (https://github.com/vllm-project/vllm/releases), they deprecated the parameter in 0.10.2 (last month) and completely removed support for the V0 APIs in 0.11 (last week).
I tested it with the following code:
```python
from smolagents import VLLMModel, CodeAgent

model = VLLMModel(model_id="HuggingFaceTB/SmolLM2-360M-Instruct")
agent = CodeAgent(model=model, tools=[])
agent.run("print the first 10 integers")
```
and I was able to reproduce the issue:
INFO 10-10 11:51:29 [llm.py:306] Supported_tasks: ['generate']
╭─────────────────────────────────────────────────────────────────────────────────────────────────── New run ───────────────────────────────────────────────────────────────────────────────────────────────────╮
│ │
│ Hello │
│ │
╰─ VLLMModel - HuggingFaceTB/SmolLM2-360M-Instruct ─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ Step 1 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Error in generating model output:
LLM.generate() got an unexpected keyword argument 'guided_options_request'
