frontend: added temperature gauge to assistant form #901

Open
wants to merge 5 commits into base: main

Conversation

@ezawadski (Collaborator) commented Jan 9, 2025

Added the Slider Component from the Coral Web interface to the Assistants Web interface.

Added Temperature Setting to Assistant Create and Update forms. This setting is then used when chatting with the assistant.

AI Description

This PR introduces a new temperature parameter to the AgentSettingsForm component and its associated types, allowing users to set the temperature for agents. The temperature parameter is now included in the CreateAgentSettingsFields and UpdateAgentSettingsFields types, as well as in the ConfigStep component.

  • The temperature parameter is added to the ConfigStep of the AgentSettingsForm component, giving users a control for setting an agent's temperature.
  • It is included in the CreateAgentSettingsFields and UpdateAgentSettingsFields types, so the value is captured when creating or updating an agent.
  • A default value of 0.3 is defined in the DEFAULT_AGENT_TEMPERATURE constant.
  • Chat.tsx passes the agent's temperature when chatting with the agent.
  • CreateAgent.tsx and UpdateAgent.tsx set the temperature when creating or updating an agent.
  • paramsSlice.ts includes the temperature so it is taken into account when creating or updating agents.
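
For reference, a rough sketch of the shape these changes take. Only the names called out above (Slider, ConfigStep, CreateAgentSettingsFields, UpdateAgentSettingsFields, DEFAULT_AGENT_TEMPERATURE) come from the PR; the field lists and the Slider props are illustrative assumptions, not the actual code:

// Default used when an agent has no explicit temperature (value from the PR description).
export const DEFAULT_AGENT_TEMPERATURE = 0.3;

// Hypothetical field list; only `temperature` is the addition under discussion.
export type CreateAgentSettingsFields = {
  name: string;
  model?: string;
  preamble?: string;
  temperature: number; // new field added by this PR
};

export type UpdateAgentSettingsFields = Partial<CreateAgentSettingsFields>;

// Inside the ConfigStep of AgentSettingsForm, roughly (Slider props are guesses):
// <Slider
//   label="Temperature"
//   min={0}
//   max={1}
//   step={0.1}
//   value={fields.temperature ?? DEFAULT_AGENT_TEMPERATURE}
//   onChange={(temperature) => setFields({ ...fields, temperature })}
// />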

@codecov-commenter commented Jan 9, 2025

Codecov Report

Attention: Patch coverage is 42.85714% with 4 lines in your changes missing coverage. Please review.

Project coverage is 77.81%. Comparing base (eb317e9) to head (85beb4e).
Report is 3 commits behind head on main.

Files with missing lines          | Patch %  | Lines
src/backend/services/chat.py      | 20.00%   | 4 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main     #901      +/-   ##
==========================================
- Coverage   77.84%   77.81%   -0.03%     
==========================================
  Files         259      259              
  Lines       11177    11181       +4     
==========================================
  Hits         8701     8701              
- Misses       2476     2480       +4     


@EugeneLightsOn (Collaborator) left a comment

Great PR, Eric! Just one small nit, and it’s ready to go.

@tianjing-li (Collaborator) left a comment

Seems the backend already has a temperature value on Agents.

Did you check that the Agent's updated temperature value gets set in the chat_request? See process_chat().

I think you might need to set it like these lines:

# Set the agent settings in the chat request
chat_request.model = agent.model
chat_request.preamble = agent.preamble
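# Presumably the temperature would be set the same way (hypothetical line):
# chat_request.temperature = agent.temperature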

Otherwise frontend changes lgtm

@ezawadski (Collaborator, Author) commented

> Seems the backend already has a temperature value on Agents.
>
> Did you check that the Agent's updated temperature value gets set in the chat_request? See process_chat().
>
> I think you might need to set it like these lines:
>
> # Set the agent settings in the chat request
> chat_request.model = agent.model
> chat_request.preamble = agent.preamble
>
> Otherwise frontend changes lgtm

@tianjing-li

The backend uses the temperature passed with the ChatRequest when the chat endpoint is called. I think this is better than defaulting to what is set on the Model, because it allows a user to potentially change the temperature between messages. I have the frontend take the value from the model and pass it with the request to the backend.

We keep this business logic in the frontend and leave the backend more flexible and scalable.
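
A minimal sketch of that flow on the frontend side, assuming hypothetical request and helper names (only temperature and DEFAULT_AGENT_TEMPERATURE come from the PR itself):

const DEFAULT_AGENT_TEMPERATURE = 0.3;

// Hypothetical request shape; the real chat request type lives elsewhere in the codebase.
type ChatRequest = {
  message: string;
  agent_id?: string;
  temperature?: number;
};

// Take the temperature from the agent's saved settings and send it with each message,
// falling back to the default when the agent has no explicit value.
function buildChatRequest(message: string, agent: { id: string; temperature?: number }): ChatRequest {
  return {
    message,
    agent_id: agent.id,
    temperature: agent.temperature ?? DEFAULT_AGENT_TEMPERATURE,
  };
}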

@tianjing-li (Collaborator) commented

@ezawadski I believe the opposite might happen if we don't enforce these values being used by the backend.

We could send temperature=0.8 in the request body to POST /chat-stream, for example, while the Agent actually has a temperature of 0.5, and the request would use 0.8 rather than the value on the Agent; that's my understanding, at least. We can chat more next week online if needed.

@ezawadski ezawadski requested review from malexw and a team as code owners January 13, 2025 18:07
@ezawadski ezawadski requested a review from tianjing-li January 13, 2025 18:51
@tianjing-li (Collaborator) left a comment

Looks great, thank you.
