Commit a108b0a
bump version to 0.4.0
yoziru committed Oct 17, 2023
1 parent bbca65b commit a108b0a
Showing 2 changed files with 15 additions and 8 deletions.
21 changes: 14 additions & 7 deletions CHANGELOG.md

@@ -2,21 +2,28 @@
 
 ## Unreleased
 
+## 0.4.0
+
+- Add 'Enabled' switch for vector services to configuration UI by @sd2k in https://github.com/grafana/grafana-llm-app/pull/79
+- Added instructions for developing with example app by @edwardcqian in https://github.com/grafana/grafana-llm-app/pull/86
+- Improve health check to return more granular details by @sd2k in https://github.com/grafana/grafana-llm-app/pull/85
+- Add support for filtered vector search by @yoziru in https://github.com/grafana/grafana-llm-app/pull/100
+
 ## 0.3.0
 
-* Add Go package providing an OpenAI client to use the LLM app from backend Go code
-* Add support for Azure OpenAI. The plugin must be configured to use OpenAI and provide a link between OpenAI model names and Azure deployment names
-* Return streaming errors as part of the stream, with objects like `{"error": "<error message>"}`
+- Add Go package providing an OpenAI client to use the LLM app from backend Go code
+- Add support for Azure OpenAI. The plugin must be configured to use OpenAI and provide a link between OpenAI model names and Azure deployment names
+- Return streaming errors as part of the stream, with objects like `{"error": "<error message>"}`
 
 ## 0.2.1
 
-* Improve health check endpoint to include status of various features
-* Change path handling for chat completions streams to put separate requests into separate streams. Requests can pass a UUID as the suffix of the path now, but is backwards compatible with an older version of the frontend code.
+- Improve health check endpoint to include status of various features
+- Change path handling for chat completions streams to put separate requests into separate streams. Requests can pass a UUID as the suffix of the path now, but is backwards compatible with an older version of the frontend code.
 
 ## 0.2.0
 
-* Expose vector search API to perform semantic search against a vector database using a configurable embeddings source
+- Expose vector search API to perform semantic search against a vector database using a configurable embeddings source
 
 ## 0.1.0
 
-* Support proxying LLM requests from Grafana to OpenAI
+- Support proxying LLM requests from Grafana to OpenAI
2 changes: 1 addition & 1 deletion package.json

@@ -1,6 +1,6 @@
 {
   "name": "llm",
-  "version": "0.3.0",
+  "version": "0.4.0",
   "description": "Plugin to easily allow llm based extensions to grafana",
   "scripts": {
     "build": "webpack -c ./.config/webpack/webpack.config.ts --env production",
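
The 0.3.0 changelog entry above notes that streaming errors are returned in-band, as objects like `{"error": "<error message>"}`. The sketch below is a rough illustration only (the event shape and handler name are hypothetical, not taken from the plugin's actual frontend API); it shows how a stream consumer could watch for that field and treat it as a failure rather than as completion content:

```typescript
// Hypothetical shape of a chat completions stream event. Only the `error`
// field comes from the changelog; everything else is assumed.
interface StreamEvent {
  error?: string;
  content?: string; // normal completion chunk (assumed field name)
}

// Hypothetical handler for one raw stream message.
function handleStreamEvent(raw: string): string {
  const event: StreamEvent = JSON.parse(raw);
  if (event.error !== undefined) {
    // Errors arrive inside the stream payload rather than tearing down the
    // transport, so surface them explicitly instead of rendering them as text.
    throw new Error(`LLM stream error: ${event.error}`);
  }
  return event.content ?? "";
}
```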
