diff --git a/README.md b/README.md
index 13afb112caa..79c6aa1cc4a 100644
--- a/README.md
+++ b/README.md
@@ -133,8 +133,9 @@ pipeline("the secret to baking a really good cake is ")
 
 To chat with a model, the usage pattern is the same. The only difference is you need to construct a chat history (the input to `Pipeline`) between you and the system.
 
 > [!TIP]
-> You can also chat with a model directly from the command line.
+> You can also chat with a model directly from the command line. Please make sure you have the `chat` extra installed:
 > ```shell
+> pip install .[chat]  # or pip install transformers[chat]
 > transformers chat Qwen/Qwen2.5-0.5B-Instruct
 > ```
diff --git a/docs/source/en/conversations.md b/docs/source/en/conversations.md
index f661eb91e58..005a4386a05 100644
--- a/docs/source/en/conversations.md
+++ b/docs/source/en/conversations.md
@@ -31,6 +31,13 @@ This guide shows you how to quickly start chatting with Transformers from the co
 ### Interactive chat session
 
 After you've [installed Transformers](./installation.md), chat with a model directly from the command line as shown below. It launches an interactive session with a model, with a few base commands listed at the start of the session.
 
+For this to work, you need to have installed the `chat` extra:
+
+```bash
+pip install transformers[chat]
+```
+
+You can then launch an interactive session as follows:
 ```bash
 transformers chat Qwen/Qwen2.5-0.5B-Instruct
diff --git a/setup.py b/setup.py
index a3881418841..97eb0dcc88a 100644
--- a/setup.py
+++ b/setup.py
@@ -314,6 +314,7 @@ extras["hub-kernels"] = deps_list("kernels")
 extras["integrations"] = extras["hub-kernels"] + extras["optuna"] + extras["ray"] + extras["sigopt"]
 extras["serving"] = deps_list("pydantic", "uvicorn", "fastapi", "starlette")
+extras["chat"] = deps_list("aiohttp", "rich")
 
 extras["audio"] = deps_list(
     "librosa",
     "pyctcdecode",