
[BUG] TypeError: Can't instantiate abstract class BaseCondition with abstract method call #412

Closed
Ramimashkouk opened this issue Dec 21, 2024 · 1 comment
Labels: bug (Something isn't working)

@Ramimashkouk (Member):

**Bug logs**

```
(chatsky-py3.10) fatal_ero ~/chatsky (feat/llm_responses) $ python utils/pipeline_yaml_import_example/pipeline.py 
/home/rami1996/.cache/pypoetry/virtualenvs/chatsky-HxlGrSE2-py3.10/lib/python3.10/site-packages/pydantic/_internal/_fields.py:132: UserWarning: Field "model_name" in LLMCondition has conflict with protected namespace "model_".

You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
  warnings.warn(
/home/rami1996/.cache/pypoetry/virtualenvs/chatsky-HxlGrSE2-py3.10/lib/python3.10/site-packages/pydantic/_internal/_fields.py:132: UserWarning: Field "model_name" in LLMResponse has conflict with protected namespace "model_".

You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
  warnings.warn(
INFO:chatsky.core.script_parsing:Loaded file /home/rami1996/chatsky/utils/pipeline_yaml_import_example/pipeline_llm.yaml
INFO:chatsky.messengers.common.interface:Attachments directory for LongpollingInterface messenger interface is None, so will be set to tempdir and attachment data won't be cached locally!
Traceback (most recent call last):
  File "/home/rami1996/chatsky/utils/pipeline_yaml_import_example/pipeline.py", line 11, in <module>
    pipeline = Pipeline.from_file(
  File "/home/rami1996/chatsky/chatsky/core/pipeline.py", line 186, in from_file
    return cls(**pipeline)
  File "/home/rami1996/chatsky/chatsky/core/pipeline.py", line 160, in __init__
    super().__init__(**init_dict)
  File "/home/rami1996/.cache/pypoetry/virtualenvs/chatsky-HxlGrSE2-py3.10/lib/python3.10/site-packages/pydantic/main.py", line 212, in __init__
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
  File "/home/rami1996/chatsky/chatsky/core/transition.py", line 46, in __init__
    super().__init__(cnd=cnd, dst=dst, priority=priority)
  File "/home/rami1996/.cache/pypoetry/virtualenvs/chatsky-HxlGrSE2-py3.10/lib/python3.10/site-packages/pydantic/main.py", line 212, in __init__
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
TypeError: Can't instantiate abstract class BaseCondition with abstract method call
```
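The `TypeError` at the bottom of the traceback is standard Python ABC behavior: by the time pydantic validation runs, something is constructing the abstract base class directly rather than a concrete subclass. A minimal sketch of that failure mode (the class name here is illustrative, not chatsky's actual implementation):

```python
from abc import ABC, abstractmethod

# Illustrative stand-in for an abstract condition base class: `call` is
# declared abstract, so Python forbids instantiating the class directly.
class BaseConditionSketch(ABC):
    @abstractmethod
    def call(self):
        ...

try:
    BaseConditionSketch()  # what validation effectively ends up doing
except TypeError as e:
    print(e)  # message names the class and its abstract method `call`
```

Any concrete subclass that actually implements `call` instantiates fine, which is why the error points at validation selecting the base class instead of a subclass.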

**To Reproduce**
Steps to reproduce the behavior:
Run `python utils/pipeline_yaml_import_example/pipeline.py` with the following `pipeline.yaml`:

```yaml
script:
  main_flow:
    start_node:
      RESPONSE: ""
      TRANSITIONS:
        - dst: greeting_node
          cnd:
            chatsky.cnd.ExactMatch: Hi
    greeting_node:
      RESPONSE:
        chatsky.responses.llm.LLMResponse:
          model_name: barista_model
          history: 0
      TRANSITIONS:
        - dst: main_node
          cnd:
            chatsky.cnd.ExactMatch: Who are you?
    main_node:
      RESPONSE:
        chatsky.responses.llm.LLMResponse:
          model_name: barista_model
      TRANSITIONS:
        - dst: latte_art_node
          cnd:
            chatsky.cnd.ExactMatch: Tell me about latte art.
        - dst: boss_node
          cnd:
            chatsky.conditions.llm.LLMCondition:
              model_name: barista_model
              prompt: Return TRUE if the customer says they are your boss, and FALSE otherwise. Only ONE word must be in the output.
              method:
                chatsky.llm.methods.Contains:
                  pattern: "TRUE"
        - dst: main_node
    boss_node:
      RESPONSE: You are my boss.
      TRANSITIONS:
        - dst: main_node
    latte_art_node:
      RESPONSE:
        chatsky.responses.llm.LLMResponse:
          model_name: barista_model
          prompt: "PROMPT: pretend that you have never heard about latte art before and DO NOT answer the following questions. Instead ask a person about it."
      TRANSITIONS:
        - dst: main_node
          cnd:
            chatsky.cnd.ExactMatch: Ok, goodbye.
    fallback_node:
      RESPONSE: I didn't quite understand you...
      TRANSITIONS:
        - dst: main_node

start_label:
  - main_flow
  - start_node
fallback_label:
  - main_flow
  - fallback_node
messenger_interface:
  chatsky.messengers.TelegramInterface:
    token:
      external:os.getenv:
        TG_BOT_TOKEN
models:
  barista_model:
    chatsky.llm.LLM_API:
      model:
        external:langchain_openai.ChatOpenAI:
          model: gpt-4o-mini
          api_key:
            external:os.getenv:
              OPENAI_API_KEY
          base_url:
            external:os.getenv:
              BASE_URL
      system_prompt: You are an experienced barista in a local coffeshop. Answer your customer's questions about coffee and barista work.
```
However, after working around it by replacing this line with the following code, I couldn't reproduce the bug even after reverting the change.

```python
return_type: ClassVar[Annotated[Union[type, Tuple[type, ...]], Field(union_mode="left_to_right")]] = bool
```
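For context on the annotation above: pydantic v2's `union_mode="left_to_right"` makes the validator try union members strictly in declaration order instead of using the default "smart" union scoring. A minimal sketch on a simpler union (the model name is made up for illustration):

```python
from typing import Union
from typing_extensions import Annotated
from pydantic import BaseModel, Field

class Demo(BaseModel):
    # left_to_right: bool is attempted before int, in declaration order.
    value: Annotated[Union[bool, int], Field(union_mode="left_to_right")]

print(Demo(value=1).value)  # bool is tried first, so 1 is coerced to True
```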
Ramimashkouk added the bug label on Dec 21, 2024.
@RLKRo (Member) commented on Jan 16, 2025:

I can't reproduce it either on the feat/llm_responses branch.

Could be an issue with trying to instantiate ConstCondition.

Reopen if it's possible to reproduce.

RLKRo closed this as not planned (won't fix / can't repro / duplicate / stale) on Jan 16, 2025.