
initial release: Python

@rudolfolah released this on 07 Aug 14:16

The initial version of Chaincrafter has been released and published for Python. It currently supports OpenAI, with plans to support local LLMs such as gpt4all and llama.cpp.

OpenAI was the simplest backend for trying out the asynchronous support (built on asyncio) and the experiments feature.

Async support

Async support lets you make multiple requests to the LLM at the same time:

Example Code
import asyncio

from chaincrafter import Chain, Prompt
from chaincrafter.models import OpenAiChat

# Chat model configuration shared by every chain in this example
chat_model = OpenAiChat(
    temperature=0.9,
    model_name="gpt-3.5-turbo",
    presence_penalty=0.1,
    frequency_penalty=0.2,
)


def make_chain(country):
    # The system prompt sets the assistant's behavior for the whole chain
    system_prompt = Prompt("You are a helpful assistant who responds to questions about the world")
    # {city} is filled in with the output of the previous prompt in the chain
    followup_prompt = Prompt("{city} sounds like a nice place to visit. What is the population of {city}?")
    hello_prompt = Prompt(f"Hello, what is the capital of {country}? Answer only with the city name.")
    return Chain(
        system_prompt,
        # Each (prompt, key) pair stores the model's response under that key for later prompts
        (hello_prompt, "city"),
        (followup_prompt, "followup_response"),
    )


async def main():
    chain_france = make_chain("France")
    chain_china = make_chain("China")
    # Run both chains concurrently against the same chat model
    results = await asyncio.gather(
        chain_france.async_run(chat_model),
        chain_china.async_run(chat_model),
    )
    for messages in results:
        for message in messages:
            print(f"{message['role']}: {message['content']}")

asyncio.run(main())
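
The same pattern scales past two chains: build one chain per input and gather them all. A short sketch reusing make_chain and chat_model from the example above (run_many is a new helper introduced here purely for illustration):

async def run_many(countries):
    # One chain per country, all run concurrently against the same model
    chains = [make_chain(country) for country in countries]
    return await asyncio.gather(*(chain.async_run(chat_model) for chain in chains))

all_results = asyncio.run(run_many(["France", "China", "Japan"]))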

Experiments support

Experiments let you test combinations of model parameters against the same prompts. You can use them to compare models and parameter settings, and to track how results change over time.

Example Code
from chaincrafter import Chain, Prompt
from chaincrafter.experiments import OpenAiChatExperiment

system_prompt = Prompt("You are a helpful assistant who responds to questions about the world")
hello_prompt = Prompt("Hello, what is the capital of France? Answer only with the city name.")
followup_prompt = Prompt("{city} sounds like a nice place to visit. What is the population of {city}?")
chain = Chain(
    system_prompt,
    (hello_prompt, "city"),
    (followup_prompt, "followup_response"),
)
# Each parameter takes a list of values; the experiment runs the chain for the combinations of those values
experiment = OpenAiChatExperiment(
    chain,
    model_name=["gpt-4", "gpt-3.5-turbo"],
    temperature=[0.7, 1.5],
    presence_penalty=[0.1],
    frequency_penalty=[0.2],
)
experiment.run()
print(experiment.results)
# CSV Output
print(experiment.to_csv())
# JSON Output
print(experiment.to_json())
# Pandas DataFrame Output
print(experiment.to_pandas_df())
# Pandas DataFrame Visualize
print(experiment.visualize())
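
Since to_pandas_df() returns an ordinary pandas DataFrame, results can also be sliced and compared with standard pandas calls. A minimal sketch, assuming the DataFrame includes model_name and temperature columns (the column names are assumptions, not documented in this release):

df = experiment.to_pandas_df()
# Count the recorded runs per model/temperature pair; column names are assumed
summary = df.groupby(["model_name", "temperature"]).size()
print(summary)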
