ChatInterface type='messages', API still uses tuples format #9375

Open
taoari opened this issue Sep 17, 2024 · 2 comments
Labels
bug Something isn't working

Comments


taoari commented Sep 17, 2024

Describe the bug

When type="messages" is set for ChatInterface, the history is in messages format in the GUI, but from an API call the history is still in tuples format.

Have you searched existing issues? 🔎

  • I have searched and found no existing issues

Reproduction

# Server script: launch the ChatInterface with type='messages'
import gradio as gr

def echo(message, history):
    print(message, history)
    return message

demo = gr.ChatInterface(fn=echo, type='messages', title="Echo Bot")
demo.launch()

# Client script (run separately): call the running app via gradio_client
from gradio_client import Client

client = Client("http://localhost:7860/")
result = client.predict(
    message="Hello!!",
    api_name="/chat"
)
print(result)

result = client.predict(
    message="Hello again!!",
    api_name="/chat"
)
print(result)

The following is from Gradio Web:

hello []
hello2 [{'role': 'user', 'metadata': {'title': None}, 'content': 'hello'}, {'role': 'assistant', 'metadata': {'title': None}, 'content': 'hello'}]

The following is from gradio_client:

Hello!! []
Hello again!! [['Hello!!', 'Hello!!']]

The history format is inconsistent between the two.
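For reference, the same one-turn exchange looks like this in each format (reconstructed from the outputs above; metadata keys omitted):

# messages format, as received by the handler in the GUI
history_messages = [
    {"role": "user", "content": "Hello!!"},
    {"role": "assistant", "content": "Hello!!"},
]

# tuples format, as received when calling through gradio_client
history_tuples = [
    ["Hello!!", "Hello!!"],
]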

Screenshot

No response

Logs

No response

System Info

Gradio Environment Information:
------------------------------
Operating System: Darwin
gradio version: 4.44.0
gradio_client version: 1.3.0

------------------------------------------------
gradio dependencies in your environment:

aiofiles: 23.2.1
anyio: 4.4.0
fastapi: 0.112.2
ffmpy: 0.4.0
gradio-client==1.3.0 is not installed.
httpx: 0.27.0
huggingface-hub: 0.24.6
importlib-resources: 6.4.0
jinja2: 3.1.4
markupsafe: 2.1.5
matplotlib: 3.9.2
numpy: 1.26.4
orjson: 3.10.7
packaging: 24.1
pandas: 2.2.2
pillow: 10.4.0
pydantic: 2.8.2
pydub: 0.25.1
python-multipart: 0.0.9
pyyaml: 6.0.2
ruff: 0.6.2
semantic-version: 2.10.0
tomlkit==0.12.0 is not installed.
typer: 0.12.5
typing-extensions: 4.11.0
urllib3: 2.2.2
uvicorn: 0.30.6
authlib; extra == 'oauth' is not installed.
itsdangerous; extra == 'oauth' is not installed.


gradio_client dependencies in your environment:

fsspec: 2024.5.0
httpx: 0.27.0
huggingface-hub: 0.24.6
packaging: 24.1
typing-extensions: 4.11.0
websockets: 12.0

Severity

Blocking usage of gradio

taoari added the bug label Sep 17, 2024
@abidlabs
Member

Thanks @taoari for flagging this!

@gtvracer

I'm getting something similar: when the ChatInterface's fn is called, I get this error:

gradio.exceptions.Error: "Data incompatible with messages format. Each message should be a dictionary with 'role' and 'content' keys or a ChatMessage object.

and indeed, my history list has that metadata component in it too. So in the declaration of fn, I used a lambda to try to address that by applying clean_history() to the history parameter:

def clean_history(history):
    # Keep only 'role' and 'content' keys from each history entry
    return [{"role": entry["role"], "content": entry["content"]} for entry in history]

gr.ChatInterface(
    # fn=respond,
    fn=lambda message, history, max_tokens, temperature, top_p: respond(
        message, clean_history(history), max_tokens_slider.value, temperature_slider.value, top_p_slider.value
    ),
    additional_inputs=[max_tokens_slider, temperature_slider, top_p_slider],
    type='messages',
)

but I'm still getting the same error...
Any help would be great!
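If the handler has to tolerate both shapes until this is resolved, here is a minimal sketch of a normalizer that accepts either role/content dicts or [user, assistant] pairs (the helper name is illustrative):

def normalize_history(history):
    # Coerce either history shape into plain role/content dicts.
    normalized = []
    for entry in history:
        if isinstance(entry, dict):
            # messages format: keep only 'role' and 'content'
            normalized.append({"role": entry["role"], "content": entry["content"]})
        else:
            # tuples format: [user_message, assistant_message]
            user_msg, assistant_msg = entry
            if user_msg is not None:
                normalized.append({"role": "user", "content": user_msg})
            if assistant_msg is not None:
                normalized.append({"role": "assistant", "content": assistant_msg})
    return normalized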

This is running under the FastAPI framework. I had this working fine with Flask, but the app and chatbot were on different ports. With FastAPI, it's running on the same port via:

gr.mount_gradio_app(app, chatter, "/chat")

where /chat is the src of the iframe it's running in.
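For context, a minimal version of that mounting setup as a sketch (the respond handler and module name here are placeholders, not the real ones):

from fastapi import FastAPI
import gradio as gr

app = FastAPI()

def respond(message, history):
    # placeholder handler standing in for the real respond()
    return message

chatter = gr.ChatInterface(fn=respond, type="messages")
app = gr.mount_gradio_app(app, chatter, path="/chat")

# served with e.g.: uvicorn main:app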
