
pypi_0.5.0 Support llama3, fix LLM interface

Released by @sudoskys on 26 Sep 10:50 · 9cbed20

What's Changed

  • (fix): If the model is a string, a cost calculation error will be raised by @sudoskys in #74
  • (feat): Novelai tokenizer re-implement || New LLM by @sudoskys in #75

Preset


Preset -> https://github.com/LlmKira/novelai-python/blob/main/src/novelai_python/sdk/ai/generate/_enum.py#L199

# Imports assumed from the repository layout linked above; adjust to your installed version
from novelai_python import APIError
from novelai_python.sdk.ai.generate import LLM
from novelai_python.sdk.ai.generate._enum import TextLLMModel, get_default_preset


async def chat(prompt: str):
    try:
        model = TextLLMModel.ERATO  # the new llama3-based model
        # Pull the default generation parameters for the selected model
        parameters = get_default_preset(model).parameters
        agent = LLM.build(
            prompt=prompt,
            model=model,
            parameters=parameters  # or pass None to auto-select from the preset
        )
        # login_credential is a previously built credential/session object
        result = await agent.request(session=login_credential)
    except APIError as e:
        raise Exception(f"Error: {e.message}")
    print(f"Result: \n{result.text}")
  

Full Changelog: pypi_0.4.16...pypi_0.5.0