
[Question]: How to add a custom variable into system prompt? #17333

Open
minmie opened this issue Dec 20, 2024 · 5 comments
Labels
question Further information is requested

Comments

@minmie

minmie commented Dec 20, 2024

Question Validation

  • I have searched both the documentation and discord for an answer.

Question

I want to add a custom variable (user_info) to the system prompt and update it dynamically.

I hope that when the user says "I like fishing, tell me something about it.", the agent will call a tool to update user_info, so that the next time the agent calls the LLM, the system prompt includes the latest user_info (meaning the hobby list now includes fishing).

Can someone tell me how to do this?

My code is as follows.

from pydantic import Field
from llama_index.core.tools import FunctionTool
from llama_index.agent.openai import OpenAIAgent


user_info = {
    'name': 'arvin',
    'age': '24',
    'weight': '60kg',
    'height': '1.73m',
    'hobby': ['basketball', 'listen music'],
}


system_prompt = (
    "you are a helpful assistant.\n"
    "# User-Information\n"
    "Below is the user's personal information, which may help you give better responses.\n"
    "In conversations with users, they may disclose their personal information, and you should call the update_user_info tool to update this information.\n"
    "---------------------\n"
    "{user_info}\n"
    "---------------------\n"
)

def update_user_info(
    property_name: str = Field(
        description="property name of user information, like name or age."
    ),
    property_value: str = Field(
            description="the property value corresponding to the property name"
        ),
) -> dict:
    """Useful for updating user information!"""

    return {property_name: property_value}

tool = FunctionTool.from_defaults(update_user_info)

agent = OpenAIAgent.from_tools([tool], verbose=True, system_prompt=system_prompt)



while True:

    query = input('user:')
    response = agent.chat(message=query)
    print(f'bot:{response}')

dosubot bot commented Dec 20, 2024

Hey there, @minmie! I'm here to help you out with any bugs, questions, or contributions. Let me know how I can assist you!

To dynamically update the user_info variable in the system prompt during runtime, you can follow these steps:

  1. Modify the update_user_info function: Ensure that this function updates the user_info dictionary with new information provided by the user.

  2. Update the system prompt: After updating the user_info, you need to regenerate the system prompt with the updated information.

  3. Integrate the update logic in the chat loop: After each user input, check if the update_user_info tool was called and update the system prompt accordingly.

Here's how you can implement these steps:

from pydantic import Field
from llama_index.core.tools import FunctionTool
from llama_index.agent.openai import OpenAIAgent

# Initial user information
user_info = {
    'name': 'arvin',
    'age': '24',
    'weight': '60kg',
    'height': '1.73m',
    'hobby': ['basketball', 'listen music'],
}

# Function to update user information
def update_user_info(
    property_name: str = Field(
        description="property name of user information, like name or age."
    ),
    property_value: str = Field(
        description="the property value corresponding to the property name"
    ),
) -> dict:
    """Useful for updating user-information!"""
    user_info[property_name] = property_value
    return {property_name: property_value}

# Create a tool from the update function
tool = FunctionTool.from_defaults(update_user_info)

# Function to generate the system prompt
def generate_system_prompt():
    return (
        "you are a helpful assistant.\n"
        "# User-Information\n"
        "Below is the user's personal information, which may help you give better responses.\n"
        "In conversations with users, they may disclose their personal information, and you should call the update_user_info tool to update this information.\n"
        "---------------------\n"
        f"{user_info}\n"
        "---------------------\n"
    )

# Initialize the agent with the initial system prompt
agent = OpenAIAgent.from_tools([tool], verbose=True, system_prompt=generate_system_prompt())

while True:
    query = input('user:')
    response = agent.chat(message=query)
    print(f'bot:{response}')

    # After each interaction, rebuild the prompt with the latest user_info.
    # Caveat: simply assigning agent.system_prompt may not take effect, since
    # OpenAIAgent bakes the prompt into its prefix messages at construction
    # time; recreating the agent and carrying over the chat history is safer.
    agent = OpenAIAgent.from_tools(
        [tool],
        verbose=True,
        system_prompt=generate_system_prompt(),
        chat_history=agent.chat_history,
    )

In this setup, the update_user_info function modifies the user_info dictionary directly. After each user interaction, the system prompt is regenerated to include the latest user information, ensuring that the agent has the most up-to-date context for future interactions.
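The key point is that update_user_info must mutate the shared user_info dict (the version in the question only returned the new pair without saving it anywhere). That mechanism can be checked in plain Python, with no llama_index required:

```python
# Plain-Python check of the pattern above: the tool mutates a shared dict,
# and the prompt builder reads that dict fresh on every call.
user_info = {"name": "arvin", "hobby": ["basketball", "listen music"]}

def update_user_info(property_name: str, property_value) -> dict:
    user_info[property_name] = property_value  # persist, don't just return
    return {property_name: property_value}

def generate_system_prompt() -> str:
    return (
        "you are a helpful assistant.\n"
        "# User-Information\n"
        f"{user_info}\n"
    )

update_user_info("hobby", ["basketball", "listen music", "fishing"])
print("fishing" in generate_system_prompt())  # True
```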


@minmie
Author

minmie commented Dec 20, 2024

How do I dynamically update the prompt, the way qa_template does? @dosu


dosubot bot commented Dec 20, 2024

To dynamically update the system prompt in a manner similar to how the qa_template does, you can use the graph_response_answer_prompt parameter when initializing the KnowledgeGraphQueryEngine. Here's an example:

from llama_index.core.prompts.base import PromptTemplate
from llama_index.core.prompts.prompt_type import PromptType
from llama_index.core.query_engine import KnowledgeGraphQueryEngine

custom_prompt_template = PromptTemplate(
    "Your custom prompt template here",
    prompt_type=PromptType.QUESTION_ANSWER,
)

# assumes `storage_context` was built elsewhere
kg_query_engine = KnowledgeGraphQueryEngine(
    storage_context=storage_context,
    graph_response_answer_prompt=custom_prompt_template,
)

Additionally, you can define a custom prompt using a format string and convert it to either text or message prompts, depending on your needs. For example:

from llama_index.core import PromptTemplate

template = (
    "We have provided context information below. \n"
    "---------------------\n"
    "{context_str}"
    "\n---------------------\n"
    "Given this information, please answer the question: {query_str}\n"
)
qa_template = PromptTemplate(template)

# Create text prompt for completion API
prompt = qa_template.format(context_str=..., query_str=...)

# Convert to message prompts for chat API
messages = qa_template.format_messages(context_str=..., query_str=...)

You can also update prompts dynamically using the update_prompts function on modules that implement get_prompts. This allows you to pass in custom prompts at query-time or during index construction, providing flexibility in how prompts are used across different operations [1][2].
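As a rough pure-Python sketch of that get_prompts/update_prompts pattern (the class and prompt-key names here are illustrative, not llama_index's actual implementation):

```python
# Illustrative sketch of the PromptMixin-style pattern: a module exposes its
# prompts as a dict and accepts replacements keyed by the same names.
class QueryEngineSketch:
    def __init__(self):
        self._prompts = {
            "response_synthesizer:text_qa_template": (
                "Context:\n{context_str}\nAnswer the question: {query_str}\n"
            )
        }

    def get_prompts(self) -> dict:
        # Inspect the current prompts (and their keys) before overriding.
        return dict(self._prompts)

    def update_prompts(self, prompts_dict: dict) -> None:
        # Swap in custom templates at any point, even after construction.
        self._prompts.update(prompts_dict)

engine = QueryEngineSketch()
engine.update_prompts(
    {"response_synthesizer:text_qa_template": "Be concise.\n{context_str}\n{query_str}"}
)
print(engine.get_prompts()["response_synthesizer:text_qa_template"])
```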


@logan-markewich
Collaborator

Could use a variable mapping: https://docs.llamaindex.ai/en/stable/examples/prompts/advanced_prompts/#3-prompt-function-mappings

@minmie
Author

minmie commented Dec 20, 2024

This example doesn't seem very helpful. I don't want to rename existing variables; I want to add a new variable (user_info) to the template and pass a value to it. @logan-markewich
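For what it's worth, the function-mapping mechanism in the linked docs can also inject a brand-new variable, not only rename one: each mapped variable's value is computed by a function at format time. Here is a pure-Python sketch of the idea (in llama_index the mapping would be passed as PromptTemplate's function_mappings argument; the format_prompt helper below is illustrative only):

```python
# Sketch: a template variable ("user_info") whose value is computed by a
# function at format time, so it always reflects the latest state.
user_info = {"name": "arvin", "hobby": ["fishing"]}

template = (
    "you are a helpful assistant.\n"
    "# User-Information\n"
    "{user_info}\n"
    "user: {query_str}\n"
)

# Each mapped variable is filled by calling its function with the format kwargs.
function_mappings = {"user_info": lambda **kwargs: str(user_info)}

def format_prompt(template: str, mappings: dict, **kwargs) -> str:
    computed = {name: fn(**kwargs) for name, fn in mappings.items()}
    return template.format(**computed, **kwargs)

user_info["hobby"].append("basketball")
prompt = format_prompt(template, function_mappings, query_str="tell me about fishing")
print("basketball" in prompt)  # True
```

Because the mapping function runs on every format call, later mutations of user_info show up in the next prompt automatically.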
