Merge branch 'master' into update/PIPER_VERSION
dave-gray101 authored Oct 30, 2024
2 parents a508ec8 + 88edb1e commit d3d37aa
Showing 164 changed files with 36 additions and 8,970 deletions.
File renamed without changes.
11 changes: 11 additions & 0 deletions .bruno/LocalAI Test Requests/model gallery/model delete.bru
Original file line number Diff line number Diff line change
@@ -0,0 +1,11 @@
meta {
name: model delete
type: http
seq: 7
}

post {
url: {{PROTOCOL}}{{HOST}}:{{PORT}}/models/galleries
body: none
auth: none
}
Binary file not shown.
16 changes: 16 additions & 0 deletions .bruno/LocalAI Test Requests/transcription/transcribe.bru
@@ -0,0 +1,16 @@
meta {
name: transcribe
type: http
seq: 1
}

post {
url: {{PROTOCOL}}{{HOST}}:{{PORT}}/v1/audio/transcriptions
body: multipartForm
auth: none
}

body:multipart-form {
file: @file(transcription/gb1.ogg)
model: whisper-1
}
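The `transcribe.bru` request above posts a `multipartForm` body with a `file` part and a `model` field to `/v1/audio/transcriptions`. As an illustration only (the endpoint and field names come from the request above; the helper and its host-side use are hypothetical), the same body can be assembled with the Python standard library:

```python
import io
import uuid

def build_multipart(fields: dict, file_field: str, filename: str, file_bytes: bytes):
    # Assemble a multipart/form-data body with plain text fields plus one
    # file part, matching the shape of the .bru request above.
    boundary = uuid.uuid4().hex
    buf = io.BytesIO()
    for name, value in fields.items():
        buf.write(f'--{boundary}\r\n'
                  f'Content-Disposition: form-data; name="{name}"\r\n\r\n'
                  f'{value}\r\n'.encode())
    buf.write(f'--{boundary}\r\n'
              f'Content-Disposition: form-data; name="{file_field}"; '
              f'filename="{filename}"\r\n'
              f'Content-Type: application/octet-stream\r\n\r\n'.encode())
    buf.write(file_bytes)
    buf.write(f'\r\n--{boundary}--\r\n'.encode())
    return boundary, buf.getvalue()

boundary, body = build_multipart({"model": "whisper-1"}, "file",
                                 "gb1.ogg", b"\x00fake-ogg-bytes")
```

Sending it is then a plain POST to `{{PROTOCOL}}{{HOST}}:{{PORT}}/v1/audio/transcriptions` with header `Content-Type: multipart/form-data; boundary=<boundary>`.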
File renamed without changes.
4 changes: 2 additions & 2 deletions Makefile
@@ -8,15 +8,15 @@ DETECT_LIBS?=true
# llama.cpp versions
GOLLAMA_REPO?=https://github.com/go-skynet/go-llama.cpp
GOLLAMA_VERSION?=2b57a8ae43e4699d3dc5d1496a1ccd42922993be
CPPLLAMA_VERSION?=61715d5cc83a28181df6a641846e4f6a740f3c74
CPPLLAMA_VERSION?=8f275a7c4593aa34147595a90282cf950a853690

# go-rwkv version
RWKV_REPO?=https://github.com/donomii/go-rwkv.cpp
RWKV_VERSION?=661e7ae26d442f5cfebd2a0881b44e8c55949ec6

# whisper.cpp version
WHISPER_REPO?=https://github.com/ggerganov/whisper.cpp
WHISPER_CPP_VERSION?=d4bc413505b2fba98dffbb9a176ddd1b165941d0
WHISPER_CPP_VERSION?=55e422109b3504d1a824935cc2681ada7ee9fd38

# bert.cpp version
BERT_REPO?=https://github.com/go-skynet/go-bert.cpp
3 changes: 2 additions & 1 deletion README.md
@@ -40,7 +40,7 @@

> :bulb: Get help - [❓FAQ](https://localai.io/faq/) [💭Discussions](https://github.com/go-skynet/LocalAI/discussions) [:speech_balloon: Discord](https://discord.gg/uJAeKSAGDy) [:book: Documentation website](https://localai.io/)
>
> [💻 Quickstart](https://localai.io/basics/getting_started/) [🖼️ Models](https://models.localai.io/) [🚀 Roadmap](https://github.com/mudler/LocalAI/issues?q=is%3Aissue+is%3Aopen+label%3Aroadmap) [🥽 Demo](https://demo.localai.io) [🌍 Explorer](https://explorer.localai.io) [🛫 Examples](https://github.com/go-skynet/LocalAI/tree/master/examples/)
> [💻 Quickstart](https://localai.io/basics/getting_started/) [🖼️ Models](https://models.localai.io/) [🚀 Roadmap](https://github.com/mudler/LocalAI/issues?q=is%3Aissue+is%3Aopen+label%3Aroadmap) [🥽 Demo](https://demo.localai.io) [🌍 Explorer](https://explorer.localai.io) [🛫 Examples](https://github.com/mudler/LocalAI-examples)
[![tests](https://github.com/go-skynet/LocalAI/actions/workflows/test.yml/badge.svg)](https://github.com/go-skynet/LocalAI/actions/workflows/test.yml)[![Build and Release](https://github.com/go-skynet/LocalAI/actions/workflows/release.yaml/badge.svg)](https://github.com/go-skynet/LocalAI/actions/workflows/release.yaml)[![build container images](https://github.com/go-skynet/LocalAI/actions/workflows/image.yml/badge.svg)](https://github.com/go-skynet/LocalAI/actions/workflows/image.yml)[![Bump dependencies](https://github.com/go-skynet/LocalAI/actions/workflows/bump_deps.yaml/badge.svg)](https://github.com/go-skynet/LocalAI/actions/workflows/bump_deps.yaml)[![Artifact Hub](https://img.shields.io/endpoint?url=https://artifacthub.io/badge/repository/localai)](https://artifacthub.io/packages/search?repo=localai)

@@ -85,6 +85,7 @@ local-ai run oci://localai/phi-2:latest

## 📰 Latest project news

- Oct 2024: examples moved to [LocalAI-examples](https://github.com/mudler/LocalAI-examples)
- Aug 2024: 🆕 FLUX-1, [P2P Explorer](https://explorer.localai.io)
- July 2024: 🔥🔥 🆕 P2P Dashboard, LocalAI Federated mode and AI Swarms: https://github.com/mudler/LocalAI/pull/2723
- June 2024: 🆕 You can now browse the model gallery without LocalAI! Check out https://models.localai.io
5 changes: 0 additions & 5 deletions backend/cpp/llama/grpc-server.cpp
@@ -670,7 +670,6 @@ struct llama_server_context
slot->sparams.top_k = json_value(data, "top_k", default_sparams.top_k);
slot->sparams.top_p = json_value(data, "top_p", default_sparams.top_p);
slot->sparams.min_p = json_value(data, "min_p", default_sparams.min_p);
slot->sparams.tfs_z = json_value(data, "tfs_z", default_sparams.tfs_z);
slot->sparams.typ_p = json_value(data, "typical_p", default_sparams.typ_p);
slot->sparams.temp = json_value(data, "temperature", default_sparams.temp);
slot->sparams.dynatemp_range = json_value(data, "dynatemp_range", default_sparams.dynatemp_range);
@@ -1206,7 +1205,6 @@ struct llama_server_context
{"top_k", slot.sparams.top_k},
{"top_p", slot.sparams.top_p},
{"min_p", slot.sparams.min_p},
{"tfs_z", slot.sparams.tfs_z},
{"typical_p", slot.sparams.typ_p},
{"repeat_last_n", slot.sparams.penalty_last_n},
{"repeat_penalty", slot.sparams.penalty_repeat},
@@ -2105,7 +2103,6 @@ json parse_options(bool streaming, const backend::PredictOptions* predict, llama
// slot->params.n_predict = json_value(data, "n_predict", default_params.n_predict);
// slot->sparams.top_k = json_value(data, "top_k", default_sparams.top_k);
// slot->sparams.top_p = json_value(data, "top_p", default_sparams.top_p);
// slot->sparams.tfs_z = json_value(data, "tfs_z", default_sparams.tfs_z);
// slot->sparams.typical_p = json_value(data, "typical_p", default_sparams.typical_p);
// slot->sparams.temp = json_value(data, "temperature", default_sparams.temp);
// slot->sparams.penalty_last_n = json_value(data, "repeat_last_n", default_sparams.penalty_last_n);
@@ -2129,7 +2126,6 @@ json parse_options(bool streaming, const backend::PredictOptions* predict, llama
data["n_predict"] = predict->tokens() == 0 ? -1 : predict->tokens();
data["top_k"] = predict->topk();
data["top_p"] = predict->topp();
data["tfs_z"] = predict->tailfreesamplingz();
data["typical_p"] = predict->typicalp();
data["temperature"] = predict->temperature();
data["repeat_last_n"] = predict->repeat();
@@ -2176,7 +2172,6 @@ json parse_options(bool streaming, const backend::PredictOptions* predict, llama
// llama.params.n_predict = predict->tokens() == 0 ? -1 : predict->tokens();
// llama.params.sparams.top_k = predict->topk();
// llama.params.sparams.top_p = predict->topp();
// llama.params.sparams.tfs_z = predict->tailfreesamplingz();
// llama.params.sparams.typical_p = predict->typicalp();
// llama.params.sparams.penalty_last_n = predict->repeat();
// llama.params.sparams.temp = predict->temperature();
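The deletions in this file strip every `tfs_z` (tail-free sampling) reference, a sampler parameter removed upstream in llama.cpp. The surviving lines keep the `json_value(data, key, default)` pattern, which falls back to the server default whenever a request omits a sampler field. A minimal sketch of that defaulting behaviour in Python (the default values here are made up for illustration, not LocalAI's actual defaults):

```python
def json_value(data: dict, key: str, default):
    # Mirror of the C++ json_value helper: take the request's value if
    # present and non-null, otherwise fall back to the server default.
    value = data.get(key, default)
    return default if value is None else value

# Illustrative defaults only -- the real values live in default_sparams in C++.
default_sparams = {"top_k": 40, "top_p": 0.95, "min_p": 0.05,
                   "typical_p": 1.0, "temperature": 0.8}

request = {"temperature": 0.2}  # a request overriding a single field
sparams = {k: json_value(request, k, v) for k, v in default_sparams.items()}
# After this commit, no "tfs_z" key exists in the parameter set at all.
```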
2 changes: 1 addition & 1 deletion core/http/app_test.go
@@ -438,7 +438,7 @@ var _ = Describe("API test", func() {
Eventually(func() bool {
response := getModelStatus("http://127.0.0.1:9090/models/jobs/" + uuid)
return response["processed"].(bool)
}, "360s", "10s").Should(Equal(true))
}, "900s", "10s").Should(Equal(true))

Eventually(func() []string {
models, _ := client.ListModels(context.TODO())
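The one-line change in `app_test.go` raises the Gomega `Eventually` timeout for this model-install check from 360s to 900s while keeping the 10s polling interval. The polling semantics can be sketched as follows (a Python illustration of the pattern; the function name and the injectable clock/sleep are not LocalAI code):

```python
import time

def eventually(condition, timeout_s: float, interval_s: float,
               clock=time.monotonic, sleep=time.sleep) -> bool:
    # Poll `condition` every `interval_s` seconds until it returns a truthy
    # value or `timeout_s` elapses -- the shape of Gomega's Eventually(...).
    deadline = clock() + timeout_s
    while True:
        if condition():
            return True
        if clock() >= deadline:
            return False
        sleep(interval_s)

# The Go test's `Eventually(..., "900s", "10s")` corresponds roughly to:
#   eventually(lambda: get_model_status(url)["processed"], 900, 10)
```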
2 changes: 1 addition & 1 deletion embedded/model_library.yaml
@@ -6,4 +6,4 @@
### For models with an entire YAML file to be embedded, put the file inside the `models`
### directory, it will be automatically available with the file name as key (without the .yaml extension)

phi-2: "github://mudler/LocalAI/examples/configurations/phi-2.yaml@master"
phi-2: "github://mudler/LocalAI-examples/configurations/phi-2.yaml@main"
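The updated entry points the embedded `phi-2` config at the new examples repository via the `github://owner/repo/path@ref` shorthand. Assuming that shorthand resolves to a raw GitHub content URL (an assumption about LocalAI's resolver, shown here only to unpack the notation):

```python
def github_uri_to_raw(uri: str) -> str:
    # Unpack github://owner/repo/path@ref into a raw.githubusercontent.com
    # URL. Illustrative only -- the resolution scheme is assumed.
    prefix = "github://"
    if not uri.startswith(prefix):
        raise ValueError(f"not a github:// URI: {uri}")
    rest = uri[len(prefix):]
    if "@" in rest:
        path_part, ref = rest.rsplit("@", 1)
    else:
        path_part, ref = rest, "main"
    owner, repo, file_path = path_part.split("/", 2)
    return f"https://raw.githubusercontent.com/{owner}/{repo}/{ref}/{file_path}"
```

Note the commit also flips the ref from `@master` to `@main`, matching the default branch of the new repository.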
191 changes: 2 additions & 189 deletions examples/README.md
@@ -1,190 +1,3 @@
# Examples
# LocalAI Examples

| [ChatGPT OSS alternative](https://github.com/go-skynet/LocalAI/tree/master/examples/chatbot-ui) | [Image generation](https://localai.io/api-endpoints/index.html#image-generation) |
|------------------------------------------------------------------------------------------------------------------------|------------------------------------------------------------------------------------------------------------------------|
| ![Screenshot from 2023-04-26 23-59-55](https://user-images.githubusercontent.com/2420543/234715439-98d12e03-d3ce-4f94-ab54-2b256808e05e.png) | ![b6441997879](https://github.com/go-skynet/LocalAI/assets/2420543/d50af51c-51b7-4f39-b6c2-bf04c403894c) |

| [Telegram bot](https://github.com/go-skynet/LocalAI/tree/master/examples/telegram-bot) | [Flowise](https://github.com/go-skynet/LocalAI/tree/master/examples/flowise) |
|------------------------------------------------------------------------------------------------------------------------|------------------------------------------------------------------------------------------------------------------------|
| ![Screenshot from 2023-06-09 00-36-26](https://github.com/go-skynet/LocalAI/assets/2420543/e98b4305-fa2d-41cf-9d2f-1bb2d75ca902) | ![Screenshot from 2023-05-30 18-01-03](https://github.com/go-skynet/LocalAI/assets/2420543/02458782-0549-4131-971c-95ee56ec1af8) |

Here is a list of projects that can easily be integrated with the LocalAI backend.


### Projects

### AutoGPT

_by [@mudler](https://github.com/mudler)_

This example shows how to use AutoGPT with LocalAI.

[Check it out here](https://github.com/go-skynet/LocalAI/tree/master/examples/autoGPT/)

### Chatbot-UI

_by [@mkellerman](https://github.com/mkellerman)_

![Screenshot from 2023-04-26 23-59-55](https://user-images.githubusercontent.com/2420543/234715439-98d12e03-d3ce-4f94-ab54-2b256808e05e.png)

This integration shows how to use LocalAI with [mckaywrigley/chatbot-ui](https://github.com/mckaywrigley/chatbot-ui).

[Check it out here](https://github.com/go-skynet/LocalAI/tree/master/examples/chatbot-ui/)

There is also a separate example to show how to manually setup a model: [example](https://github.com/go-skynet/LocalAI/tree/master/examples/chatbot-ui-manual/)

### K8sGPT

_by [@mudler](https://github.com/mudler)_

This example shows how to use LocalAI inside Kubernetes with [k8sgpt](https://k8sgpt.ai).

![Screenshot from 2023-06-19 23-58-47](https://github.com/go-skynet/go-ggml-transformers.cpp/assets/2420543/cab87409-ee68-44ae-8d53-41627fb49509)

### Fine-tuning a model and convert it to gguf to use it with LocalAI

_by [@mudler](https://github.com/mudler)_

An end-to-end example of fine-tuning a model with [axolotl](https://github.com/OpenAccess-AI-Collective/axolotl) and converting it to gguf for use with LocalAI.

[Check it out here](https://github.com/mudler/LocalAI/tree/master/examples/e2e-fine-tuning/)

### Flowise

_by [@mudler](https://github.com/mudler)_

This example shows how to use [FlowiseAI/Flowise](https://github.com/FlowiseAI/Flowise) with LocalAI.

[Check it out here](https://github.com/go-skynet/LocalAI/tree/master/examples/flowise/)

### Discord bot

_by [@mudler](https://github.com/mudler)_

Run a discord bot which lets you talk directly with a model

[Check it out here](https://github.com/go-skynet/LocalAI/tree/master/examples/discord-bot/), or for a live demo you can talk with our bot in #random-bot in our discord server.

### Langchain

_by [@dave-gray101](https://github.com/dave-gray101)_

A ready-to-use, end-to-end example showing how to integrate LocalAI with langchain.

[Check it out here](https://github.com/go-skynet/LocalAI/tree/master/examples/langchain/)

### Langchain Python

_by [@mudler](https://github.com/mudler)_

A ready-to-use, end-to-end example showing how to integrate LocalAI with langchain.

[Check it out here](https://github.com/go-skynet/LocalAI/tree/master/examples/langchain-python/)

### LocalAI functions

_by [@mudler](https://github.com/mudler)_

A ready to use example to show how to use OpenAI functions with LocalAI

[Check it out here](https://github.com/go-skynet/LocalAI/tree/master/examples/functions/)

### LocalAI WebUI

_by [@dhruvgera](https://github.com/dhruvgera)_

![image](https://user-images.githubusercontent.com/42107491/235344183-44b5967d-ba22-4331-804c-8da7004a5d35.png)

A light, community-maintained web interface for LocalAI

[Check it out here](https://github.com/go-skynet/LocalAI/tree/master/examples/localai-webui/)

### How to run rwkv models

_by [@mudler](https://github.com/mudler)_

A full example on how to run RWKV models with LocalAI

[Check it out here](https://github.com/go-skynet/LocalAI/tree/master/examples/rwkv/)

### PrivateGPT

_by [@mudler](https://github.com/mudler)_

A full example on how to run PrivateGPT with LocalAI

[Check it out here](https://github.com/go-skynet/LocalAI/tree/master/examples/privateGPT/)

### Slack bot

_by [@mudler](https://github.com/mudler)_

Run a slack bot which lets you talk directly with a model

[Check it out here](https://github.com/go-skynet/LocalAI/tree/master/examples/slack-bot/)

### Slack bot (Question answering)

_by [@mudler](https://github.com/mudler)_

Run a slack bot, ideally for teams, which lets you ask questions on a documentation website, or a github repository.

[Check it out here](https://github.com/go-skynet/LocalAI/tree/master/examples/slack-qa-bot/)

### Question answering on documents with llama-index

_by [@mudler](https://github.com/mudler)_

Shows how to integrate with [Llama-Index](https://gpt-index.readthedocs.io/en/stable/getting_started/installation.html) to enable question answering on a set of documents.

[Check it out here](https://github.com/go-skynet/LocalAI/tree/master/examples/query_data/)

### Question answering on documents with langchain and chroma

_by [@mudler](https://github.com/mudler)_

Shows how to integrate with `Langchain` and `Chroma` to enable question answering on a set of documents.

[Check it out here](https://github.com/go-skynet/LocalAI/tree/master/examples/langchain-chroma/)

### Telegram bot

_by [@mudler](https://github.com/mudler)_

![Screenshot from 2023-06-09 00-36-26](https://github.com/go-skynet/LocalAI/assets/2420543/e98b4305-fa2d-41cf-9d2f-1bb2d75ca902)

Use LocalAI to power a Telegram bot assistant, with Image generation and audio support!

[Check it out here](https://github.com/go-skynet/LocalAI/tree/master/examples/telegram-bot/)

### Template for Runpod.io

_by [@fHachenberg](https://github.com/fHachenberg)_

Allows running any LocalAI-compatible model as a backend on the servers of https://runpod.io

[Check it out here](https://runpod.io/gsc?template=uv9mtqnrd0&ref=984wlcra)

### Continue

_by [@gruberdev](https://github.com/gruberdev)_

<img src="continue/img/screen.png" width="600" height="200" alt="Screenshot">

Demonstrates how to integrate an open-source copilot alternative that enhances code analysis, completion, and improvements. This approach seamlessly integrates with any LocalAI model, offering a more user-friendly experience.

[Check it out here](https://github.com/go-skynet/LocalAI/tree/master/examples/continue/)

### Streamlit bot

_by [@majoshi1](https://github.com/majoshi1)_

![Screenshot](streamlit-bot/streamlit-bot.png)

A chat bot made using `Streamlit` & LocalAI.

[Check it out here](https://github.com/go-skynet/LocalAI/tree/master/examples/streamlit-bot/)

## Want to contribute?

Create an issue, and put `Example: <description>` in the title! We will post your examples here.
LocalAI examples were moved to a dedicated repository: https://github.com/mudler/LocalAI-examples
9 changes: 0 additions & 9 deletions examples/autoGPT/.env.example

This file was deleted.

36 changes: 0 additions & 36 deletions examples/autoGPT/README.md

This file was deleted.

