[Question]: How to add a custom variable into system prompt? #17333
Comments
Hey there, @minmie! I'm here to help you out with any bugs, questions, or contributions. Let me know how I can assist you!

To dynamically update the system prompt, you can keep the user information in a mutable dictionary, expose a tool that updates it, and regenerate the system prompt after each interaction.
Here's how you can implement these steps:

```python
from pydantic import Field
from llama_index.core.tools import FunctionTool
from llama_index.agent.openai import OpenAIAgent

# Initial user information
user_info = {
    'name': 'arvin',
    'age': '24',
    'weight': '60kg',
    'height': '1.73m',
    'hobby': ['basketball', 'listen music'],
}

# Function to update user information
def update_user_info(
    property_name: str = Field(
        description="property name of user information, like name or age."
    ),
    property_value: str = Field(
        description="the property value corresponding to the property name"
    ),
) -> dict:
    """Useful for updating user information!"""
    user_info[property_name] = property_value
    return {property_name: property_value}

# Create a tool from the update function
tool = FunctionTool.from_defaults(update_user_info)

# Function to generate the system prompt
def generate_system_prompt():
    return (
        "You are a helpful assistant.\n"
        "# User Information\n"
        "Below is the user's personal information, which may help you give better responses.\n"
        "In conversation, the user may disclose personal information; call the update_user_info tool to record it.\n"
        "---------------------\n"
        f"{user_info}\n"
        "---------------------\n"
    )

# Initialize the agent with the initial system prompt
agent = OpenAIAgent.from_tools(
    [tool], verbose=True, system_prompt=generate_system_prompt()
)

while True:
    query = input('user: ')
    response = agent.chat(message=query)
    print(f'bot: {response}')
    # After each interaction, refresh the system prompt with the latest user_info
    agent.system_prompt = generate_system_prompt()
```

In this setup, the system prompt is regenerated after every interaction, so each subsequent LLM call sees the latest `user_info`.
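The pattern above does not depend on any particular agent API. Here is a pure-Python sketch of the same idea, with hypothetical `render_system_prompt` and `update_user_info` helpers standing in for the agent machinery: mutating the shared dict changes the next rendered prompt.

```python
# Pure-Python sketch of the regenerate-per-turn pattern.
user_info = {'name': 'arvin', 'hobby': ['basketball']}

def render_system_prompt(info: dict) -> str:
    # Re-render the template from the current state on every call.
    return (
        "You are a helpful assistant.\n"
        "# User Information\n"
        f"{info}\n"
    )

def update_user_info(info: dict, key: str, value) -> dict:
    # The tool mutates shared state; the next render picks it up.
    info[key] = value
    return {key: value}

before = render_system_prompt(user_info)
update_user_info(user_info, 'hobby', ['basketball', 'fishing'])
after = render_system_prompt(user_info)

print('fishing' in before)  # False
print('fishing' in after)   # True
```

The key design point is that the prompt is a function of mutable state, rendered fresh on every turn, rather than a string fixed at construction time.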
How to dynamically update a prompt, like qa_template does? @dosu
To dynamically update the system prompt in a manner similar to how the qa_template works, you can pass a custom PromptTemplate to the relevant component. For example:

```python
from llama_index.core.prompts.base import PromptTemplate
from llama_index.core.prompts.prompt_type import PromptType
from llama_index.core.query_engine import KnowledgeGraphQueryEngine

custom_prompt_template = PromptTemplate(
    "Your custom prompt template here",
    prompt_type=PromptType.QUESTION_ANSWER,
)

kg_query_engine = KnowledgeGraphQueryEngine(
    storage_context=storage_context,  # assumes an existing storage_context
    graph_response_answer_prompt=custom_prompt_template,
)
```

Additionally, you can define a custom prompt using a format string and convert it to either text or message prompts, depending on your needs. For example:

```python
from llama_index.core import PromptTemplate

template = (
    "We have provided context information below. \n"
    "---------------------\n"
    "{context_str}"
    "\n---------------------\n"
    "Given this information, please answer the question: {query_str}\n"
)
qa_template = PromptTemplate(template)

# Create a text prompt for the completion API
prompt = qa_template.format(context_str=..., query_str=...)

# Convert to message prompts for the chat API
messages = qa_template.format_messages(context_str=..., query_str=...)
```

You can also update prompts dynamically on existing components via their `update_prompts` method.
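The two-step formatting idea (fill some variables now, the rest later) can be illustrated without llama_index. The `MiniPromptTemplate` class below is a hypothetical stand-in, not the real PromptTemplate API, but it mirrors the shape of partial formatting:

```python
# Minimal stand-in illustrating two-stage prompt formatting.
class MiniPromptTemplate:
    def __init__(self, template: str, **partial_vars):
        self.template = template
        self.partial_vars = partial_vars

    def partial_format(self, **kwargs) -> "MiniPromptTemplate":
        # Pre-fill some variables now, leave the rest for later.
        merged = {**self.partial_vars, **kwargs}
        return MiniPromptTemplate(self.template, **merged)

    def format(self, **kwargs) -> str:
        # Combine pre-filled and just-supplied variables.
        return self.template.format(**self.partial_vars, **kwargs)

template = MiniPromptTemplate(
    "Context:\n{context_str}\nQuestion: {query_str}\n"
)
# Fill the context once, reuse for many queries.
with_context = template.partial_format(context_str="Fishing is a hobby.")
print(with_context.format(query_str="What is fishing?"))
```

This is the same separation the thread is after: template structure is fixed once, while individual variables are supplied (or re-supplied) at render time.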
You could use a variable mapping.
This example doesn't seem very helpful. I'm not asking how to rename existing variables, but rather how to add a new variable (user_info) to the template and pass a value to that new variable. @logan-markewich
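For what the thread is actually asking (a new variable whose value is computed fresh on each render), llama_index's prompt templates support `template_var_mappings` and `function_mappings` parameters (check the docs for your version). The core idea of function mappings can be sketched in plain Python; `render` and its arguments here are illustrative, not a real API:

```python
# Sketch of function mappings: a template variable whose value is
# computed by a function each time the prompt is rendered.
user_info = {'hobby': ['basketball']}

def render(template: str, function_mappings: dict, **kwargs) -> str:
    # Call each mapped function to produce that variable's value.
    computed = {name: fn() for name, fn in function_mappings.items()}
    return template.format(**computed, **kwargs)

template = "You are a helpful assistant.\nUser info: {user_info}\nQ: {query_str}"
mappings = {'user_info': lambda: str(user_info)}

first = render(template, mappings, query_str="hi")
user_info['hobby'].append('fishing')
second = render(template, mappings, query_str="hi again")

print('fishing' in first)   # False
print('fishing' in second)  # True
```

Because the `user_info` variable is bound to a function rather than a fixed string, every render reflects the current state of the dictionary with no manual prompt rebuilding.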
Question Validation
Question
I want to add a custom variable (user_info) into the system prompt and dynamically update it.
I hope that when the user says "I like fishing, tell me something about it.", the agent will call a tool to update user_info, so that the next time the agent calls the LLM, the system prompt includes the latest user_info (meaning hobby now includes fishing).
Can someone tell me how to do this?
My code is as follows.