(chatsky-py3.10) fatal_ero ~/chatsky (feat/llm_responses) $ python utils/pipeline_yaml_import_example/pipeline.py
/home/rami1996/.cache/pypoetry/virtualenvs/chatsky-HxlGrSE2-py3.10/lib/python3.10/site-packages/pydantic/_internal/_fields.py:132: UserWarning: Field "model_name" in LLMCondition has conflict with protected namespace "model_".
You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
  warnings.warn(
/home/rami1996/.cache/pypoetry/virtualenvs/chatsky-HxlGrSE2-py3.10/lib/python3.10/site-packages/pydantic/_internal/_fields.py:132: UserWarning: Field "model_name" in LLMResponse has conflict with protected namespace "model_".
You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
  warnings.warn(
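
These two warnings are separate from the crash below: pydantic v2 reserves the model_ prefix for its own attributes, so fields such as model_name in LLMCondition and LLMResponse trigger the protected-namespace warning. A minimal sketch of the opt-out that the warning itself suggests (class and field names are taken from the warning text; the actual chatsky definitions may look different):

from pydantic import BaseModel, ConfigDict

class LLMCondition(BaseModel):
    # Opt out of pydantic's reserved "model_" namespace so that model_name no longer warns
    model_config = ConfigDict(protected_namespaces=())

    model_name: str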
INFO:chatsky.core.script_parsing:Loaded file /home/rami1996/chatsky/utils/pipeline_yaml_import_example/pipeline_llm.yaml
INFO:chatsky.messengers.common.interface:Attachments directory for LongpollingInterface messenger interface is None, so will be set to tempdir and attachment data won't be cached locally!
Traceback (most recent call last):
  File "/home/rami1996/chatsky/utils/pipeline_yaml_import_example/pipeline.py", line 11, in <module>
    pipeline = Pipeline.from_file(
  File "/home/rami1996/chatsky/chatsky/core/pipeline.py", line 186, in from_file
    return cls(**pipeline)
  File "/home/rami1996/chatsky/chatsky/core/pipeline.py", line 160, in __init__
    super().__init__(**init_dict)
  File "/home/rami1996/.cache/pypoetry/virtualenvs/chatsky-HxlGrSE2-py3.10/lib/python3.10/site-packages/pydantic/main.py", line 212, in __init__
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
  File "/home/rami1996/chatsky/chatsky/core/transition.py", line 46, in __init__
    super().__init__(cnd=cnd, dst=dst, priority=priority)
  File "/home/rami1996/.cache/pypoetry/virtualenvs/chatsky-HxlGrSE2-py3.10/lib/python3.10/site-packages/pydantic/main.py", line 212, in __init__
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
TypeError: Can't instantiate abstract class BaseCondition with abstract method call
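
The TypeError at the bottom is ordinary Python ABC behaviour: somewhere along the YAML import path the abstract BaseCondition itself gets instantiated instead of a concrete condition such as ExactMatch or LLMCondition. A chatsky-independent sketch of when Python raises exactly this message (the call signature here is illustrative, not chatsky's):

from abc import ABC, abstractmethod

class BaseCondition(ABC):
    @abstractmethod
    async def call(self, ctx) -> bool:
        ...

class ExactMatch(BaseCondition):
    def __init__(self, text: str):
        self.text = text

    async def call(self, ctx) -> bool:
        return ctx == self.text

ExactMatch("Hi")   # OK: the only abstract method, call, is implemented
BaseCondition()    # TypeError: Can't instantiate abstract class BaseCondition with abstract method call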
To Reproduce
Steps to reproduce the behavior:
Run python utils/pipeline_yaml_import_example/pipeline.py with the following pipeline.yaml:
script:
  main_flow:
    start_node:
      RESPONSE: ""
      TRANSITIONS:
        - dst: greeting_node
          cnd:
            chatsky.cnd.ExactMatch: Hi
    greeting_node:
      RESPONSE:
        chatsky.responses.llm.LLMResponse:
          model_name: barista_model
          history: 0
      TRANSITIONS:
        - dst: main_node
          cnd:
            chatsky.cnd.ExactMatch: Who are you?
    main_node:
      RESPONSE:
        chatsky.responses.llm.LLMResponse:
          model_name: barista_model
      TRANSITIONS:
        - dst: latte_art_node
          cnd:
            chatsky.cnd.ExactMatch: Tell me about latte art.
        - dst: boss_node
          cnd:
            chatsky.conditions.llm.LLMCondition:
              model_name: barista_model
              prompt: Return TRUE if the customer says they are your boss, and FALSE otherwise. Only ONE word must be in the output.
              method:
                chatsky.llm.methods.Contains:
                  pattern: "TRUE"
        - dst: main_node
    boss_node:
      RESPONSE: You are my boss.
      TRANSITIONS:
        - dst: main_node
    latte_art_node:
      RESPONSE:
        chatsky.responses.llm.LLMResponse:
          model_name: barista_model
          prompt: "PROMPT: pretend that you have never heard about latte art before and DO NOT answer the following questions. Instead ask a person about it."
      TRANSITIONS:
        - dst: main_node
          cnd:
            chatsky.cnd.ExactMatch: Ok, goodbye.
    fallback_node:
      RESPONSE: I didn't quite understand you...
      TRANSITIONS:
        - dst: main_node
start_label:
  - main_flow
  - start_node
fallback_label:
  - main_flow
  - fallback_node
messenger_interface:
  chatsky.messengers.TelegramInterface:
    token:
      external:os.getenv:
        TG_BOT_TOKEN
models:
  barista_model:
    chatsky.llm.LLM_API:
      model:
        external:langchain_openai.ChatOpenAI:
          model: gpt-4o-mini
          api_key:
            external:os.getenv:
              OPENAI_API_KEY
          base_url:
            external:os.getenv:
              BASE_URL
      system_prompt: You are an experienced barista in a local coffeshop. Answer your customer's questions about coffee and barista work.
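
One way to narrow this down might be to construct the failing condition directly in Python and see whether the abstract-class error still appears, which would separate a YAML-import problem from a problem in the condition classes themselves. A sketch only, assuming the constructors accept keyword arguments that mirror the YAML keys above (the actual signatures may differ):

from chatsky.conditions.llm import LLMCondition
from chatsky.llm.methods import Contains

# Keyword arguments are assumed to mirror the YAML keys above; adjust if the API differs.
cnd = LLMCondition(
    model_name="barista_model",
    prompt=(
        "Return TRUE if the customer says they are your boss, and FALSE otherwise. "
        "Only ONE word must be in the output."
    ),
    method=Contains(pattern="TRUE"),
)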
However, after solving it by replacing this line with the following code, I couldn't reproduce the bug even after reverting the change.