# LocalLm

An API to query local language models using different backends, or directly in the browser.

| Name | Description | Doc |
| --- | --- | --- |
| @locallm/types | The shared data types | Api doc - Readme |
| @locallm/api | Run local language models using different backends | Api doc - Readme |
| @locallm/browser | Run quantized language models inside the browser | Api doc - Readme |

## Supported backends

## Quickstart

### Api

```bash
npm install @locallm/api
# or
yarn add @locallm/api
```

Example with the Koboldcpp provider:

```typescript
import { Lm } from "@locallm/api";

const lm = new Lm({
  providerType: "koboldcpp",
  serverUrl: "http://localhost:5001",
  onToken: (t) => process.stdout.write(t),
});
const template = "<s>[INST] {prompt} [/INST]";
const _prompt = template.replace("{prompt}", "list the planets in the solar system");
// run the inference query
const res = await lm.infer(_prompt, {
  temperature: 0,
  top_p: 0.35,
  n_predict: 200,
});
console.log(res);
```
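The templating step in the example above is plain string substitution: the user prompt is inserted into the model's instruction template (Mistral-style `[INST]` tags here). A minimal sketch of that step in isolation; `renderPrompt` is a hypothetical helper name, not part of the published `@locallm/api` surface:

```typescript
// Hypothetical helper: substitute the user prompt into the model's
// instruction template. Pure string manipulation, no backend involved.
function renderPrompt(template: string, prompt: string): string {
  return template.replace("{prompt}", prompt);
}

const rendered = renderPrompt(
  "<s>[INST] {prompt} [/INST]",
  "list the planets in the solar system",
);
console.log(rendered);
```

Each backend expects the prompt format of the model it is serving, so keeping the template separate from the inference call makes it easy to swap models.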

## Examples

Check the examples directory for more examples.
