Fix warning about parameter max_new_tokens by albertvillanova · Pull Request #1721 · huggingface/smolagents

Fix warning about parameter max_new_tokens: https://github.com/huggingface/smolagents/actions/runs/17234970996/job/48897627718?pr=1690

UserWarning: max_new_tokens not provided, using this default value for max_new_tokens: 4096

tests/test_cli.py::test_load_model_transformers_model
tests/test_models.py::TestTransformersModel::test_init[patching0]
tests/test_models.py::TestTransformersModel::test_init[patching1]
tests/test_models.py::test_flatten_messages_as_text_for_all_models[TransformersModel-model_kwargs9-patching9-True]
tests/test_models.py::test_flatten_messages_as_text_for_all_models[TransformersModel-model_kwargs10-patching10-False]
  /opt/hostedtoolcache/Python/3.10.18/x64/lib/python3.10/site-packages/smolagents/models.py:848: UserWarning: max_new_tokens not provided, using this default value for max_new_tokens: 4096

This PR sets the default max_new_tokens explicitly, so the warning is no longer emitted.
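A minimal sketch of the change, assuming the fix replaces a warn-then-default pattern with a silent explicit default (the function names and the `DEFAULT_MAX_NEW_TOKENS` constant are hypothetical, not taken from the actual diff; only the default value 4096 and the warning text come from the CI log above):

```python
import warnings

DEFAULT_MAX_NEW_TOKENS = 4096  # hypothetical constant; 4096 matches the warning in the CI log


def resolve_max_new_tokens_before(kwargs: dict) -> dict:
    # Sketch of the old behavior: fall back to the default, but emit a UserWarning
    # every time the caller did not pass max_new_tokens.
    if "max_new_tokens" not in kwargs:
        warnings.warn(
            "max_new_tokens not provided, using this default value for "
            f"max_new_tokens: {DEFAULT_MAX_NEW_TOKENS}"
        )
        kwargs["max_new_tokens"] = DEFAULT_MAX_NEW_TOKENS
    return kwargs


def resolve_max_new_tokens_after(kwargs: dict) -> dict:
    # Sketch of the new behavior: apply the same default explicitly and silently.
    kwargs.setdefault("max_new_tokens", DEFAULT_MAX_NEW_TOKENS)
    return kwargs
```

Either way the caller can still override the default by passing `max_new_tokens` explicitly; the only difference is that the silent version no longer pollutes test output with a `UserWarning`.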