pypi_0.5.0 Support llama3, fix LLM interface
What's Changed
- (fix): Cost-calculation error that was raised when the model was passed as a string, by @sudoskys in #74
- (feat): Re-implemented NovelAI tokenizer || New LLM interface by @sudoskys in #75
Preset
Default model presets are defined in https://github.com/LlmKira/novelai-python/blob/main/src/novelai_python/sdk/ai/generate/_enum.py#L199
```python
# NOTE: the import paths below are assumptions; adjust them to the actual
# novelai-python package layout in your installed version.
from novelai_python import APIError
from novelai_python.sdk.ai.generate import LLM, TextLLMModel
from novelai_python.sdk.ai.generate._enum import get_default_preset


async def chat(prompt: str):
    model = TextLLMModel.ERATO  # llama3-based model
    # Fetch the default parameter preset for the chosen model.
    parameters = get_default_preset(model).parameters
    try:
        agent = LLM.build(
            prompt=prompt,
            model=model,
            parameters=parameters,  # pass None to auto-select from the preset
        )
        # login_credential: an authenticated credential created beforehand
        result = await agent.request(session=login_credential)
    except APIError as e:
        raise Exception(f"Error: {e.message}") from e
    print(f"Result: \n{result.text}")
```
Full Changelog: pypi_0.4.16...pypi_0.5.0