
Commit

Merge pull request #26 from dbpunk-labs/25-running-progress-support-for-the-terminal

fix: add spinner for the last span
imotai authored Sep 20, 2023
2 parents 393abe9 + 8fbb062 commit f5d9fc3
Showing 29 changed files with 1,644 additions and 917 deletions.
4 changes: 2 additions & 2 deletions .github/workflows/ci.yaml
@@ -23,9 +23,9 @@ jobs:
           WS_DIR=`pwd`
           bash start_sandbox.sh
           cd ${WS_DIR}/kernel
-          pytest tests/*
+          pytest tests/*.py
           cd ${WS_DIR}/agent
-          pytest tests/*
+          pytest tests/*.py
       - uses: actions/upload-artifact@v3
         if: failure()
         with:
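The test-glob fix above is the whole change: `pytest tests/*` lets the shell expand to every entry under `tests/`, while `tests/*.py` restricts the run to Python files. A minimal sketch of the difference (the directory contents are illustrative, not from this repo):

```python
from glob import glob

# `tests/*` matches every directory entry, so pytest can be handed
# non-test artifacts such as bytecode caches or fixture data.
print(glob("tests/*"))     # e.g. ['tests/__pycache__', 'tests/test_kernel.py']

# `tests/*.py` hands pytest only Python source files.
print(glob("tests/*.py"))  # e.g. ['tests/test_kernel.py']
```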
44 changes: 29 additions & 15 deletions README.md
@@ -3,37 +3,51 @@

 ![GitHub Workflow Status (with event)](https://img.shields.io/github/actions/workflow/status/dbpunk-labs/octopus/ci.yml?branch=main&style=flat-square)
 [![Discord](https://badgen.net/badge/icon/discord?icon=discord&label)](https://discord.gg/UjSHsjaz66)
 [![Twitter Follow](https://img.shields.io/twitter/follow/OCopilot7817?style=flat-square)](https://twitter.com/OCopilot7817)

-Octopus is an open-source code interpreter. You can deploy it to your remote server or your PC and access it with the Octopus client.
+> ## Octopus
+> an open-source code interpreter for terminal users

 <p align="center">
-<img width="800px" src="https://github.com/dbpunk-labs/octopus/assets/8623385/709f84f6-3b7f-49cf-b83f-e26d2d802015" align="center"/>
+<img width="1000px" src="https://github.com/dbpunk-labs/octopus/assets/8623385/bc6ed982-9d5c-473d-8efe-dbe6961b200d" align="center"/>

+## Getting Started
+
+### Install
+
+There are two ways to install Octopus.
+
 ## How It works

 Core components
 ![octopus_simple](https://github.com/dbpunk-labs/octopus/assets/8623385/e5bfb3fb-74a5-4c60-8842-a81ee54fcb9d)

-* Kernel: The code execution engine, based on notebook kernels.
-* Agent: Manages client requests, uses ReAct to process complex tasks, and stores user-assembled applications.
-* Chat: Accepts user requests, sends them to the Agent, and renders rich results. Currently supports Discord, iTerm2, and Kitty terminals.
+* Octopus Kernel: The code execution engine, based on notebook kernels.
+* Octopus Agent: Manages client requests, uses ReAct to process complex tasks, and stores user-assembled applications.
+* Octopus Terminal Cli: Accepts user requests, sends them to the Agent, and renders rich results. Currently supports Discord, iTerm2, and Kitty terminals.

 For security, it is recommended to run the kernel and agent as Docker containers.

 <p align="center">
 <img width="800px" src="https://github.com/dbpunk-labs/octopus/assets/8623385/b67ce64e-4ca8-41b0-9cb8-3b13610ff970" align="center"/>

 ## Demo

-[video](https://github.com/dbpunk-labs/octopus/assets/8623385/1b7a47e5-8ac9-4d42-9eb2-848b47b8db84)
+[video](https://github.com/dbpunk-labs/octopus/assets/8623385/bea76119-a705-4ae1-907d-cb4e0a0c18a5)

+### API Service Supported
+
+|name|status|note|
+|----|------|----|
+|[Openai GPT 3.5/4](https://openai.com/product#made-for-developers)|✅ fully supported|the detailed installation steps|
+|[Azure Openai GPT 3.5/4](https://azure.microsoft.com/en-us/products/ai-services/openai-service)|✅ fully supported|the detailed installation steps|
+|[LLama.cpp Server](https://github.com/ggerganov/llama.cpp/tree/master/examples/server)|✅ fully supported|You must provide the model|
+
+### Tested Platform
+
+|name|status|note|
+|----|------|----|
+|ubuntu 22.04|✅ fully supported|the detailed installation steps|
+|macos|✅ fully supported|the detailed installation steps|
+
-### LLM
-
-|name|type|supported status|
-|----|----|----------------|
-|GPT 3.5/4|LLM|✅ fully supported|
-|Codellama|LLM|✅ fully supported|

 ### Deployment
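The README's Agent bullet says it "uses ReAct to process complex tasks." As rough orientation, a toy version of that reason-act loop looks like the sketch below; every name here is hypothetical and unrelated to this repo's actual implementation:

```python
def react_loop(task, llm, execute_code, max_steps=8):
    """Toy ReAct loop: the model alternates between proposing an
    action and observing the result of executing it."""
    history = [f"Task: {task}"]
    for _ in range(max_steps):
        step = llm("\n".join(history))        # model emits thought + code
        if step.startswith("Final Answer:"):  # model decides it is done
            return step
        observation = execute_code(step)      # run the code in the kernel
        history.append(step)
        history.append(f"Observation: {observation}")
    return "No answer within the step budget"
```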
5 changes: 2 additions & 3 deletions agent/setup.py
@@ -20,7 +20,7 @@
 setup(
     name="octopus_agent",
     version="0.3.6",
-    description="Open source code interpreter agent for LLM",
+    description="Open source code interpreter agent",
     author="imotai",
     author_email="[email protected]",
     url="https://github.com/dbpunk-labs/octopus",
@@ -33,15 +33,14 @@
     install_requires=[
         "octopus_proto",
         "octopus_kernel",
-        "langchain>=0.0.286",
         "grpcio-tools>=1.57.0",
         "grpc-google-iam-v1>=0.12.6",
         "aiofiles",
         "orm[sqlite]",
         "python-dotenv",
         "openai",
         "json-stream",
         "aiohttp>=3.8.5",
         "replicate",
     ],
     package_data={"octopus_agent": ["*.bnf"]},
     entry_points={
56 changes: 56 additions & 0 deletions agent/src/octopus_agent/agent_builder.py
@@ -0,0 +1,56 @@
# vim:fenc=utf-8
#
# Copyright (C) 2023 dbpunk.com Author imotai <[email protected]>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

""" """
import json
from .prompt import OCTOPUS_FUNCTION_SYSTEM, OCTOPUS_CODELLAMA_SYSTEM
from .codellama_agent import CodellamaAgent
from .openai_agent import OpenaiAgent
from .codellama_client import CodellamaClient
from .mock_agent import MockAgent

def build_codellama_agent(endpoint, key, sdk, grammer_path):
"""
build codellama agent
"""
with open(grammer_path, "r") as fd:
grammar = fd.read()

client = CodellamaClient(
endpoint, key, OCTOPUS_CODELLAMA_SYSTEM, "Octopus", "User", grammar
)

# init the agent
return CodellamaAgent(client, sdk)


def build_openai_agent(sdk, model_name):
"""build openai function call agent"""
# TODO a data dir per user
# init the agent

agent = OpenaiAgent(model_name, OCTOPUS_FUNCTION_SYSTEM, sdk)
return agent


def build_mock_agent(sdk, cases_path):
"""
build the mock agent for testing
"""
with open(cases_path, "r") as fd:
messages = json.load(fd)
agent = MockAgent(messages, sdk)
return agent
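
agent_builder.py is new in this commit. A usage sketch for the three builders above; the sdk handle, model name, endpoint, key, and file paths are illustrative placeholders, not values taken from this commit:

```python
from octopus_agent.agent_builder import (
    build_codellama_agent,
    build_openai_agent,
    build_mock_agent,
)

sdk = None  # stand-in for a real octopus_kernel SDK instance

# OpenAI-backed agent driven by function calling.
openai_agent = build_openai_agent(sdk, model_name="gpt-3.5-turbo")

# Codellama agent constrained by a BNF grammar served by llama.cpp.
codellama_agent = build_codellama_agent(
    "http://127.0.0.1:8080", "key", sdk, "grammar.bnf"
)

# Mock agent that replays canned messages from a JSON file, for tests.
mock_agent = build_mock_agent(sdk, cases_path="tests/mock_messages.json")
```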
71 changes: 9 additions & 62 deletions agent/src/octopus_agent/agent_llm.py
@@ -17,8 +17,7 @@
""" """

import logging
from langchain.chat_models import AzureChatOpenAI, ChatOpenAI
from langchain.llms.fake import FakeListLLM
import openai

logger = logging.getLogger(__name__)

Expand All @@ -37,8 +36,6 @@ def __init__(self, config):
self._build_azure_openai()
elif self.config["llm_key"] == "openai":
self._build_openai()
elif self.config["llm_key"] == "mock":
self._build_mock_llm()

def get_llm(self):
return self.llms.get(self.llm_key, None)
@@ -59,17 +56,9 @@ def _build_openai(self):
"openai_api_key",
"openai_api_model",
])
api_base = self.config.get("openai_api_base", None)
api_key = self.config["openai_api_key"]
api_model = self.config["openai_api_model"]
temperature = self.config.get("temperature", 0)
llm = ChatOpenAI(
openai_api_base=api_base,
openai_api_key=api_key,
model_name=api_model,
temperature=temperature,
)
self.llms[self.llm_key] = llm
if self.config.get("openai_api_base", None):
openai.api_base = self.config.get("openai_api_base", None)
openai.api_key = self.config["openai_api_key"]

def _build_azure_openai(self):
"""
@@ -82,50 +71,8 @@ def _build_azure_openai(self):
"openai_api_type",
"openai_api_deployment",
])
api_base = self.config["openai_api_base"]
api_version = self.config["openai_api_version"]
api_type = self.config["openai_api_type"]
api_key = self.config["openai_api_key"]
api_deployment = self.config["openai_api_deployment"]
temperature = self.config.get("temperature", 0)
verbose = self.config.get("verbose", False)
llm = AzureChatOpenAI(
openai_api_base=api_base,
openai_api_version=api_version,
openai_api_key=api_key,
openai_api_type=api_type,
deployment_name=api_deployment,
temperature=temperature,
verbose=verbose,
)
self.llms[self.llm_key] = llm

def _build_mock_llm(self):
"""
build a mock llm
"""
# the response to "how to get metadata from python grpc request"
# TODO config the response
responses = [
"""Final Answer: To get metadata from a Python gRPC request context, you can access the `context.invocation_metadata()` method. This method returns a list of key-value pairs representing the metadata associated with the request.
Here's an example of how you can retrieve metadata from a gRPC request context:
```python
def my_grpc_method(request, context):
# Get the metadata from the request context
metadata = dict(context.invocation_metadata())
# Access specific metadata values
value = metadata.get('key')
# Print the metadata
print(metadata)
```
In this example, `context.invocation_metadata()` returns a list of tuples representing the metadata. By converting it to a dictionary using `dict()`, you can easily access specific metadata values using their keys.
Note that the `context` parameter in the example represents the gRPC request context object passed to the gRPC method."""
]
llm = FakeListLLM(responses=responses)
self.llms["mock"] = llm
openai.api_base = self.config["openai_api_base"]
openai.api_version = self.config["openai_api_version"]
openai.api_type = self.config["openai_api_type"]
openai.api_key = self.config["openai_api_key"]
self.config["openai_api_model"] = self.config["openai_api_deployment"]
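
With langchain gone, _build_openai and _build_azure_openai now just configure the module-level openai client instead of holding LLM objects. A sketch of what a caller can do once that configuration has run, assuming the pre-1.0 openai-python API that this code targets (the model name and prompt are illustrative):

```python
import openai

# Assumes _build_openai() already set openai.api_key (and optionally
# openai.api_base) from the agent config.
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "print('hello world') in Python"}],
    temperature=0,
)
print(response["choices"][0]["message"]["content"])
```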