pypi_0.5.1 Support llama3, new endpoint
What's Changed
- fix(0.5.1): add `tokenizer.get_vocab()` | Diff generation endpoint: text.novelai.net or api.novelai.net by @sudoskys in #76
- fix: cost calculation error raised when the model is passed as a string by @sudoskys in #74
- feat: re-implement the NovelAI tokenizer || new LLM by @sudoskys in #75
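The new `tokenizer.get_vocab()` follows the usual tokenizer contract: it returns a mapping from token strings to integer ids. A stand-in sketch of that contract (the `ToyTokenizer` class here is hypothetical, not the novelai_python implementation):

```python
# Hypothetical toy tokenizer illustrating the get_vocab() contract:
# a dict mapping token strings to integer ids.
class ToyTokenizer:
    def __init__(self, tokens):
        # Assign ids in insertion order, as most vocabularies do.
        self._vocab = {tok: idx for idx, tok in enumerate(tokens)}

    def get_vocab(self):
        # Return a copy so callers cannot mutate internal state.
        return dict(self._vocab)


vocab = ToyTokenizer(["<|start|>", "Hello", " world"]).get_vocab()
print(vocab)  # → {'<|start|>': 0, 'Hello': 1, ' world': 2}
```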
Preset
Default presets are defined at https://github.com/LlmKira/novelai-python/blob/main/src/novelai_python/sdk/ai/generate/_enum.py#L199
```python
# NOTE: import paths follow the repository layout and may differ between versions.
from novelai_python import APIError
from novelai_python.sdk.ai.generate import LLM
from novelai_python.sdk.ai.generate._enum import TextLLMModel, get_default_preset


async def chat(prompt: str):
    model = TextLLMModel.ERATO  # llama3-based model
    # Presets can be fetched explicitly if you want to inspect or tweak them:
    parameters = get_default_preset(model).parameters
    agent = LLM.build(
        prompt=prompt,
        model=model,
        parameters=None,  # None = auto-select; or pass `parameters` from above
    )
    try:
        # `login_credential` is your session credential, created elsewhere.
        result = await agent.request(session=login_credential)
    except APIError as e:
        raise Exception(f"Error: {e.message}") from e
    print(f"Result: \n{result.text}")
```
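`chat` above is a coroutine, so it has to be driven by an event loop. A minimal sketch of the driving pattern, using a stubbed coroutine so it runs without network access or credentials (the real call would pass your prompt to `chat` instead):

```python
import asyncio


# Stand-in coroutine mirroring the shape of chat() above, so the
# event-loop pattern can be shown without a NovelAI credential.
async def fake_chat(prompt: str) -> str:
    await asyncio.sleep(0)  # yield to the event loop, as a real request would
    return f"echo: {prompt}"


# asyncio.run creates an event loop, runs the coroutine to completion,
# and closes the loop.
result = asyncio.run(fake_chat("Hello"))
print(result)  # → echo: Hello
```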
Full Changelog: pypi_0.5.0...pypi_0.5.1