
pypi_0.5.1 Support llama3, new endpoint

@sudoskys sudoskys released this 26 Sep 12:28
· 14 commits to main since this release
7c71f0c

What's Changed

  • fix(0.5.1): add tokenizer.get_vocab() | Diff generation endpoint / text.novelai.net or api.novelai.net by @sudoskys in #76
  • (fix): If the model is a string, a cost calculation error will be raised by @sudoskys in #74
  • (feat): Novelai tokenizer re-implement || New LLM by @sudoskys in #75
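The `tokenizer.get_vocab()` addition exposes the tokenizer's token-to-id mapping. As a rough illustration of the shape such an API conventionally has (the class and methods below are invented for this sketch and are not novelai-python's actual implementation):

```python
# Illustrative only: sketches the token -> id mapping that a get_vocab()
# call conventionally returns; this is NOT novelai-python's tokenizer.
class TinyTokenizer:
    def __init__(self, tokens):
        # Assign ids in insertion order, as simple vocabularies often do
        self._vocab = {tok: idx for idx, tok in enumerate(tokens)}

    def get_vocab(self) -> dict:
        """Return a copy of the mapping from token string to integer id."""
        return dict(self._vocab)

    def encode(self, text: str) -> list:
        # Naive whitespace tokenization, skipping out-of-vocabulary tokens
        return [self._vocab[t] for t in text.split() if t in self._vocab]


tok = TinyTokenizer(["hello", "world"])
print(tok.get_vocab())  # {'hello': 0, 'world': 1}
```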

Preset


Preset -> https://github.com/LlmKira/novelai-python/blob/main/src/novelai_python/sdk/ai/generate/_enum.py#L199
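The linked `_enum.py` maps each model to a default generation preset, which `get_default_preset(model)` looks up. A minimal self-contained sketch of that lookup pattern (all names and values here are illustrative stand-ins, not the library's real definitions):

```python
# Illustrative re-implementation of the preset-lookup pattern behind
# get_default_preset in _enum.py; names and values are hypothetical.
from dataclasses import dataclass, field
from enum import Enum


class TextLLMModel(str, Enum):
    ERATO = "llama-3-erato-v1"  # value is a placeholder
    KAYRA = "kayra-v1"          # value is a placeholder


@dataclass
class Params:
    temperature: float = 1.0
    max_length: int = 500


@dataclass
class Preset:
    name: str
    parameters: Params = field(default_factory=Params)


# Model -> default preset registry, analogous to the table in _enum.py
_PRESETS = {TextLLMModel.ERATO: Preset(name="erato-default")}


def get_default_preset(model: TextLLMModel) -> Preset:
    # Fall back to a generic preset when no model-specific one is registered
    return _PRESETS.get(model, Preset(name="generic"))


print(get_default_preset(TextLLMModel.ERATO).name)  # erato-default
```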

```python
# Import paths follow the repository layout shown above; adjust if needed.
from novelai_python import APIError
from novelai_python.sdk.ai.generate import LLM, TextLLMModel
from novelai_python.sdk.ai.generate._enum import get_default_preset


async def chat(prompt: str):
    model = TextLLMModel.ERATO  # llama3-based model
    parameters = get_default_preset(model).parameters
    try:
        agent = LLM.build(
            prompt=prompt,
            model=model,
            parameters=parameters  # pass None to auto-select from the preset
        )
        # login_credential: a credential object created during login
        result = await agent.request(session=login_credential)
    except APIError as e:
        raise Exception(f"Error: {e.message}") from e
    print(f"Result: \n{result.text}")
```

Full Changelog: pypi_0.5.0...pypi_0.5.1