diff --git a/Add_New_Model_Providers_Guide.md b/Add_New_Model_Providers_Guide.md new file mode 100644 index 0000000..3c5e018 --- /dev/null +++ b/Add_New_Model_Providers_Guide.md @@ -0,0 +1,221 @@ +# 🤖 Models Guide + +## Supported Models + +Flock currently supports various model providers and their models: + +### OpenAI +- GPT-4 Series + - gpt-4 + - gpt-4-0314 + - gpt-4-32k + - gpt-4-32k-0314 +- GPT-3.5 Series + - gpt-3.5-turbo + - gpt-3.5-turbo-16k +- Others + - gpt-4o-mini + +### ZhipuAI +- GLM-4 Series + - glm-4-alltools + - glm-4-flash + - glm-4-0520 + - glm-4-plus + - glm-4 +- Vision Models + - glm-4v-plus + - glm-4v +- Embedding Models + - embedding-3 + +### Qwen +- Chat Models + - qwen2-57b-a14b-instruct + - qwen2-72b-instruct +- Vision Models + - qwen-vl-plus +- Embedding Models + - text-embedding-v1/v2/v3 + +### Siliconflow +- Qwen Series + - Qwen/Qwen2-7B-Instruct + +### Ollama +- Llama Series + - llama3.1:8b + +## How to Add New Model Support + +You can easily add support for new model providers by following these steps: + +### 1. Create Provider Directory + +Create a new directory under `backend/app/core/model_providers/` with your provider name: + +```bash +mkdir backend/app/core/model_providers/your_provider_name +``` + +### 2. 
Create Configuration File + +Inside your provider directory, create a `config.py` file: + +```python +from langchain_openai import ChatOpenAI # or other appropriate base class +from crewai import LLM +from app.models import ModelCategory, ModelCapability + +# Basic provider configuration +PROVIDER_CONFIG = { + "provider_name": "Your Provider Name", + "base_url": "https://api.your-provider.com/v1", + "api_key": "fake_api_key", # Default placeholder + "icon": "provider_icon", + "description": "Your Provider Description", +} + +# Define supported models +SUPPORTED_MODELS = [ + { + "name": "model-name-1", + "categories": [ModelCategory.LLM, ModelCategory.CHAT], + "capabilities": [], + }, + { + "name": "model-name-2", + "categories": [ModelCategory.LLM, ModelCategory.CHAT], + "capabilities": [ModelCapability.VISION], # For models with vision capabilities + }, + { + "name": "embedding-model", + "categories": [ModelCategory.TEXT_EMBEDDING], + "capabilities": [], + }, +] + +def init_model(model: str, temperature: float, openai_api_key: str, openai_api_base: str, **kwargs): + """Initialize a model for standard use""" + model_info = next((m for m in SUPPORTED_MODELS if m["name"] == model), None) + if model_info and ModelCategory.CHAT in model_info["categories"]: + return ChatOpenAI( + model=model, + temperature=temperature, + openai_api_key=openai_api_key, + openai_api_base=openai_api_base, + **kwargs, + ) + else: + raise ValueError(f"Model {model} is not supported as a chat model.") + +def init_crewai_model(model: str, openai_api_key: str, openai_api_base: str, **kwargs): + """Initialize a model for CrewAI use""" + model_info = next((m for m in SUPPORTED_MODELS if m["name"] == model), None) + if model_info and ModelCategory.CHAT in model_info["categories"]: + return LLM( + model=f"provider_name/{model}", # Format: provider/model + base_url=openai_api_base, + api_key=openai_api_key, + **kwargs, + ) + else: + raise ValueError(f"Model {model} is not supported as a chat 
model.")
```

### 3. Model Categories and Capabilities

Available model categories:

```python
from enum import Enum

class ModelCategory(str, Enum):
    LLM = "llm"
    CHAT = "chat"
    TEXT_EMBEDDING = "text-embedding"
    RERANK = "rerank"
    SPEECH_TO_TEXT = "speech-to-text"
    TEXT_TO_SPEECH = "text-to-speech"
```

Available capabilities:

```python
from enum import Enum

class ModelCapability(str, Enum):
    VISION = "vision"
```

### 4. Auto-Registration

The `ModelProviderManager` automatically discovers and registers your new provider when Flock starts up. It:
- Scans the `model_providers` directory
- Loads provider configurations
- Registers initialization functions
- Makes the models available in the system

### Best Practices

1. **Configuration**: Keep provider-specific configuration in the `config.py` file
2. **Model Support**: Clearly define which models are supported and their capabilities
3. **Error Handling**: Include proper error handling in initialization functions
4. **Documentation**: Provide clear descriptions for your provider and models
5. 
**Testing**: Test both standard and CrewAI initialization paths + +### Example Implementation + +Here's a complete example for a new provider: + +```python +from langchain_openai import ChatOpenAI +from crewai import LLM +from app.models import ModelCategory, ModelCapability + +PROVIDER_CONFIG = { + "provider_name": "NewAI", + "base_url": "https://api.newai.com/v1", + "api_key": "fake_api_key", + "icon": "newai_icon", + "description": "NewAI - Next Generation Language Models", +} + +SUPPORTED_MODELS = [ + { + "name": "newai-chat-large", + "categories": [ModelCategory.LLM, ModelCategory.CHAT], + "capabilities": [], + }, + { + "name": "newai-vision", + "categories": [ModelCategory.LLM, ModelCategory.CHAT], + "capabilities": [ModelCapability.VISION], + }, +] + +def init_model(model: str, temperature: float, openai_api_key: str, openai_api_base: str, **kwargs): + model_info = next((m for m in SUPPORTED_MODELS if m["name"] == model), None) + if not model_info: + raise ValueError(f"Model {model} is not supported") + + if ModelCategory.CHAT in model_info["categories"]: + return ChatOpenAI( + model=model, + temperature=temperature, + openai_api_key=openai_api_key, + openai_api_base=openai_api_base, + **kwargs, + ) + else: + raise ValueError(f"Model {model} is not supported as a chat model") + +def init_crewai_model(model: str, openai_api_key: str, openai_api_base: str, **kwargs): + model_info = next((m for m in SUPPORTED_MODELS if m["name"] == model), None) + if not model_info: + raise ValueError(f"Model {model} is not supported") + + if ModelCategory.CHAT in model_info["categories"]: + return LLM( + model=f"newai/{model}", + base_url=openai_api_base, + api_key=openai_api_key, + **kwargs, + ) + else: + raise ValueError(f"Model {model} is not supported as a chat model") +``` \ No newline at end of file diff --git a/Add_New_Tools_Guide.md b/Add_New_Tools_Guide.md new file mode 100644 index 0000000..9f15f45 --- /dev/null +++ b/Add_New_Tools_Guide.md @@ -0,0 +1,174 @@ +# 🛠️ 
Tools Guide + +## Built-in Tools + +Flock comes with several built-in tools: + +### AI Service Tools +- **Web Search Pro**: Search the internet using ZhipuAI's web search capabilities +- **Qingyan Assistant**: A versatile AI assistant that can help with various tasks including: + - Data analysis + - Creating flowcharts + - Mind mapping + - Prompt engineering + - AI drawing + - AI search +- **Image Understanding**: Analyze and understand images using ZhipuAI's vision capabilities + +### Image Generation Tools +- **Spark Image Generation**: Generate images using Spark's API +- **Siliconflow Image Generation**: Generate images using Siliconflow's API + +### Utility Tools +- **Math Calculator**: Perform mathematical calculations locally using NumExpr +- **Google Translate**: Translate text between languages using Google Translate +- **Open Weather**: Get weather information for any city +- **Ask Human**: Request human intervention or input during execution + +### External Search Tools +- **DuckDuckGo Search**: Web search using DuckDuckGo +- **Wikipedia**: Search and retrieve information from Wikipedia + +## How to Add Custom Tools + +You can easily add new tools to Flock by following these steps: + +### 1. Create Tool Directory + +Create a new directory under `backend/app/core/tools/` with your tool name: + +```bash +mkdir backend/app/core/tools/your_tool_name +``` + +### 2. Create Tool Files + +Inside your tool directory, create these files: + +#### 2.1. `__init__.py` +```python +from .your_tool import your_tool_instance + +__all__ = ["your_tool_instance"] +``` + +#### 2.2. 
`your_tool.py` +```python +from langchain.pydantic_v1 import BaseModel, Field +from langchain.tools import StructuredTool + +class YourToolInput(BaseModel): + """Input schema for your tool.""" + param1: str = Field(description="Description of parameter 1") + param2: int = Field(description="Description of parameter 2") + +def your_tool_function(param1: str, param2: int) -> str: + """ + Your tool's main functionality. + """ + # Implement your tool's logic here + result = f"Processed {param1} with {param2}" + return result + +your_tool_instance = StructuredTool.from_function( + func=your_tool_function, + name="Your Tool Name", + description="Description of what your tool does", + args_schema=YourToolInput, + return_direct=True, +) +``` + +#### 2.3. `credentials.py` (Optional) +If your tool requires API keys or other credentials: + +```python +from typing import Any, Dict + +YOUR_TOOL_CREDENTIALS = { + "API_KEY": { + "type": "string", + "description": "API key for your service", + "value": "", + }, + "API_SECRET": { + "type": "string", + "description": "API secret for your service", + "value": "", + } +} + +def get_credentials() -> Dict[str, Any]: + return YOUR_TOOL_CREDENTIALS +``` + +### 3. Access Credentials in Your Tool + +If your tool needs to use credentials: + +```python +from app.core.tools.utils import get_credential_value + +def your_tool_function(param1: str, param2: int) -> str: + api_key = get_credential_value("Your Tool Name", "API_KEY") + api_secret = get_credential_value("Your Tool Name", "API_SECRET") + + if not api_key or not api_secret: + return "Error: Required credentials are not set." + + # Use credentials in your implementation + ... +``` + +### 4. Tool Registration + +Your tool will be automatically registered when Flock starts up, thanks to the tool manager system. The tool manager: +- Scans the tools directory +- Loads all tools with proper `__all__` exports +- Makes them available in the system + +### Best Practices + +1. 
**Input Validation**: Use Pydantic models to validate input parameters
2. **Error Handling**: Always include proper error handling in your tool
3. **Documentation**: Provide clear descriptions for your tool and its parameters
4. **Credentials**: If your tool requires API keys, use the credentials system
5. **Return Values**: Return clear, structured responses that can be easily processed

### Example Tool Implementation

Here's a complete example of a simple weather tool:

```python
import requests
from langchain.pydantic_v1 import BaseModel, Field
from langchain.tools import StructuredTool
from app.core.tools.utils import get_credential_value

class WeatherInput(BaseModel):
    """Input for the weather tool."""
    city: str = Field(description="Name of the city")

def get_weather(city: str) -> str:
    """Get weather information for a city."""
    api_key = get_credential_value("Weather Tool", "API_KEY")

    if not api_key:
        return "Error: Weather API Key is not set."

    try:
        response = requests.get(
            "https://api.weather.com/data",
            params={"city": city, "key": api_key},
        )
        response.raise_for_status()
        return str(response.json())
    except Exception as e:
        return f"Error getting weather data: {str(e)}"

weather_tool = StructuredTool.from_function(
    func=get_weather,
    name="Weather Tool",
    description="Get weather information for any city",
    args_schema=WeatherInput,
    return_direct=True,
)
```
\ No newline at end of file
diff --git a/README.md b/README.md
index 64f73ab..dbca8be 100644
--- a/README.md
+++ b/README.md
@@ -7,6 +7,12 @@ Getting Started

+> [!TIP] +> +> ### 🎉 What's New +> +> **CrewAI Node Support**: Now you can leverage CrewAI's powerful multi-agent capabilities in your workflows! Create sophisticated agent teams and orchestrate complex collaborative tasks with ease. + A chatbot, RAG, agent, and multi-agent application project based on LangChain, LangGraph, and other frameworks, open-source, and capable of offline deployment. @@ -75,6 +81,16 @@ They are all excellent open-source projects, thanks🙇‍. Project tech stack: LangChain + LangGraph + React + Next.js + Chakra UI + PostgreSQL +> [!NOTE] +> +> ### 🤖 Model System +> +> Flock supports various model providers and makes it easy to add new ones. Check out our [Models Guide](Add_New_Model_Providers_Guide.md) to learn about supported models and how to add support for new providers. + +> ### 🛠️ Tools System +> +> Flock comes with various built-in tools and supports easy integration of custom tools. Check out our [Tools Guide](Add_New_Tools_Guide.md) to learn about available tools and how to add your own. + ### 💡RoadMap 1 APP @@ -83,7 +99,8 @@ Project tech stack: LangChain + LangGraph + React + Next.js + Chakra UI + Postgr - [x] SimpleRAG - [x] Hierarchical Agent - [x] Sequential Agent -- [ ] Work-Flow ---On Progress +- [x] Work-Flow +- [ ] CrewAI Integration ---On Progress - [ ] More muti-agent 2 Model @@ -92,7 +109,7 @@ Project tech stack: LangChain + LangGraph + React + Next.js + Chakra UI + Postgr - [x] ZhipuAI - [x] Siliconflow - [x] Ollama -- [ ] Qwen +- [x] Qwen - [ ] Xinference 3 Ohters diff --git a/README_cn.md b/README_cn.md index 3f3dece..911c500 100644 --- a/README_cn.md +++ b/README_cn.md @@ -7,6 +7,12 @@ 快速开始

+> [!TIP] +> +> ### 🎉 最新更新 +> +> **CrewAI 节点支持**: 现在您可以在工作流中使用 CrewAI 的强大多代理功能!轻松创建复杂的代理团队并编排复杂的协作任务。 + 一个基于 LangChain、LangGraph 和其他框架的聊天机器人、RAG、代理和多代理应用项目,开源且能够离线部署。 @@ -14,6 +20,7 @@ ![alt text](assets/login.jpg) ### 🤖️ 概览 + ![alt text](assets/image.png) ### 工作流 @@ -23,10 +30,10 @@ ### 节点类型和功能 -Flock的工作流系统由各种类型的节点组成,每种节点都有特定的用途: +Flock 的工作流系统由各种类型的节点组成,每种节点都有特定的用途: 1. 输入节点:处理初始输入并将其转换为工作流可处理的格式。 -2. LLM节点:利用大型语言模型进行文本生成和处理。 +2. LLM 节点:利用大型语言模型进行文本生成和处理。 3. 检索节点:从知识库中获取相关信息。 4. 工具节点:执行特定的任务或操作,扩展工作流功能。 5. 检索工具节点:结合检索能力和工具功能。 @@ -35,6 +42,7 @@ Flock的工作流系统由各种类型的节点组成,每种节点都有特定 8. 开始和结束节点:标记工作流的开始和结束。 未来计划添加的节点包括: + - 意图识别节点 - 条件分支节点(If-Else) - 文件上传节点 @@ -46,15 +54,20 @@ Flock的工作流系统由各种类型的节点组成,每种节点都有特定 ### Agent Chat ![image](https://github.com/user-attachments/assets/4097b087-0309-4aab-8be9-a06fdc9d4964) + ### 图像 + ![image](https://github.com/user-attachments/assets/ff6d6c92-dca8-4811-83ef-786272c46dfb) + ### 知识检索 +

image image

### Human-in-the-Loop(人工审批或让 LLM 重新思考或寻求人工帮助)
+
Flock 旨在成为一个开源的大语言模型(LLM)应用开发平台。它是一个基于 LangChain 和 LangGraph 概念的 LLM 应用。目标是创建一套支持聊天机器人、RAG 应用、代理和多代理系统的 LLMOps 解决方案,并具备离线运行能力。

受 [StreetLamb](https://github.com/StreetLamb) 项目及其 [tribe](https://github.com/StreetLamb/tribe) 项目的启发,Flock 采用了许多相同的方法和代码。在此基础上,它引入了一些新的功能和方向。

@@ -65,15 +78,26 @@ Flock 旨在成为一个开源的大语言模型(LLM)应用开发平台。

 项目技术栈:LangChain + LangGraph + React + Next.js + Chakra UI + PostgreSQL

+> [!NOTE]
+>
+> ### 🤖 模型系统
+>
+> Flock 支持多种模型提供商,并且可以轻松添加新的提供商。查看我们的[模型指南](Add_New_Model_Providers_Guide.md)了解支持的模型以及如何添加新的提供商支持。
+
+> ### 🛠️ 工具系统
+>
+> Flock 内置了多种工具,并支持轻松集成自定义工具。查看我们的[工具指南](Add_New_Tools_Guide.md)了解可用工具和如何添加自己的工具。
+
### 💡 路线图

1 应用

- [x] 聊天机器人
- [x] 简单 RAG
- [x] 层次代理
- [x] 顺序代理
-- [ ] 工作流 ---进行中
+- [x] 工作流
+- [ ] CrewAI 集成 ---进行中
- [ ] 更多多代理系统

2 模型

@@ -82,7 +106,7 @@ Flock 旨在成为一个开源的大语言模型(LLM)应用开发平台。
- [x] ZhipuAI
- [x] Siliconflow
- [x] Ollama
-- [ ] Qwen
+- [x] Qwen
- [ ] Xinference

3 其他

@@ -115,7 +139,7 @@
git clone https://github.com/Onelevenvy/flock.git
cp .env.example .env
```

-##### 1.3 ��成密钥
+##### 1.3 生成密钥

.env 文件中的一些环境变量默认值为 changethis。
您必须将它们更改为密钥,要生成密钥,可以运行以下命令:
diff --git a/README_ja.md b/README_ja.md
index 7ab2242..074e858 100644
--- a/README_ja.md
+++ b/README_ja.md
@@ -7,6 +7,12 @@ 始め方

+> [!TIP]
+>
+> ### 🎉 最新アップデート
+>
+> **CrewAI ノードのサポート**: ワークフローで CrewAI の強力なマルチエージェント機能を活用できるようになりました!洗練されたエージェントチームを作成し、複雑な協調タスクを簡単に編成できます。
+
LangChain、LangGraph、およびその他のフレームワークに基づいたチャットボット、RAG、エージェント、およびマルチエージェントアプリケーションプロジェクトで、オープンソースであり、オフライン展開が可能です。

@@ -14,7 +20,9 @@ LangChain、LangGraph、およびその他のフレームワークに基づい
![alt text](assets/login.jpg)

### 🤖️ 概要
+
![alt text](assets/image.png)
+
#### ワークフロー

![image](https://github.com/user-attachments/assets/a4e33565-7acf-45d9-8e82-5a740cd88344)

@@ -22,18 +30,19 @@

### ノードタイプと機能

-Flockのワークフローシステムは、様々なタイプのノードで構成されており、それぞれが特定の目的を果たします:
+Flock のワークフローシステムは、様々なタイプのノードで構成されており、それぞれが特定の目的を果たします:

1. 入力ノード:初期入力を処理し、ワークフローが扱える形式に変換します。
-2. LLMノード:大規模言語モデルを利用してテキスト生成と処理を行います。
+2. LLM ノード:大規模言語モデルを利用してテキスト生成と処理を行います。
3. 検索ノード:知識ベースから関連情報を取得します。
4. ツールノード:特定のタスクや操作を実行し、ワークフローの機能を拡張します。
5. 検索ツールノード:検索機能とツール機能を組み合わせます。
6. 回答ノード:前のノードの結果を統合し、最終的な回答や出力を生成します。
7. サブグラフノード:完全なサブワークフローをカプセル化し、モジュラー設計を可能にします。
8. 開始と終了ノード:ワークフローの開始と終了を示します。

将来計画されているノードには以下が含まれます:
+
- 意図認識ノード
- 条件分岐ノード(If-Else)
- ファイルアップロードノード

@@ -63,6 +72,16 @@ Flock は、大規模言語モデル(LLM)アプリケーションを開発

 プロジェクトの技術スタック:LangChain + LangGraph + React + Next.js + Chakra UI + PostgreSQL

+> [!NOTE]
+>
+> ### 🛠️ ツールシステム
+>
+> Flock には様々な組み込みツールが付属しており、カスタムツールの簡単な統合をサポートしています。利用可能なツールと独自のツールの追加方法については、[ツールガイド](Add_New_Tools_Guide.md)をご覧ください。
+
+> ### 🤖 モデルシステム
+>
+> Flock は様々なモデルプロバイダーをサポートしており、新しいプロバイダーの追加も容易です。サポートされているモデルと新しいプロバイダーの追加方法については、[モデルガイド](Add_New_Model_Providers_Guide.md)をご覧ください。
+
### 💡 ロードマップ

1 アプリ

- [x] シンプル RAG
- [x] 階層エージェント
- [x] シーケンシャルエージェント
-- [ ] ワークフロー ---進行中
+- [x] ワークフロー
+- [ ] CrewAI 統合 ---進行中
- [ ] さらに多くのマルチエージェント

2 モデル

@@ -80,7 +100,7 @@ Flock は、大規模言語モデル(LLM)アプリケーションを開発
- [x] ZhipuAI
- [x] Siliconflow
- [x] Ollama
-- [ ] Qwen
+- [x] Qwen
- [ ] Xinference

3 その他

@@ -137,7 +157,7 @@ docker compose --env-file ../.env up -d

サーバーの起動には Python 3.10.x が必要です。Python 環境を迅速にインストールするには、pyenv を使用することをお勧めします。

-追加の Python バージョンをインストールするには、pyenv install を使用します。
+追加の Python バージョンをインストールするには、pyenv install を使用します。

```bash
pyenv install 3.10