Merge pull request #18 from Azure-Samples/readme
Readme updates
pamelafox authored Oct 23, 2024
2 parents d6f4565 + 6b830f3 commit 8716728
Showing 2 changed files with 6 additions and 37 deletions.
README.md: 24 changes (6 additions & 18 deletions)
@@ -24,10 +24,10 @@ since the local app needs credentials for Azure OpenAI to work properly.

## Features

-* A Python [Quart](https://quart.palletsprojects.com/en/latest/) that uses the [openai](https://pypi.org/project/openai/) package to generate responses to user messages.
+* A Python [Quart](https://quart.palletsprojects.com/en/latest/) that uses the [openai](https://pypi.org/project/openai/) package to generate responses to user messages with uploaded image files.
* A basic HTML/JS frontend that streams responses from the backend using [JSON Lines](http://jsonlines.org/) over a [ReadableStream](https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream).
* [Bicep files](https://docs.microsoft.com/azure/azure-resource-manager/bicep/) for provisioning Azure resources, including Azure OpenAI, Azure Container Apps, Azure Container Registry, Azure Log Analytics, and RBAC roles.
-* Support for using [local LLMs](/docs/local_ollama.md) during development.
+* Support for using [GitHub models](https://github.com/marketplace/models) during development.

![Screenshot of the chat app](docs/screenshot_chatimage.png)

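The feature bullets above describe a backend that streams newline-delimited JSON ("JSON Lines") to the browser, where the frontend reads it through a ReadableStream. A minimal sketch of that streaming pattern in Quart is shown below; this is not the sample's actual code, and the route name, payload shape, and content type are assumptions.

```python
# Minimal sketch of streaming JSON Lines from a Quart backend.
# Not the sample's code: route, payload shape, and content type are assumptions.
import json

from quart import Quart, request

app = Quart(__name__)


@app.route("/chat/stream", methods=["POST"])  # hypothetical route name
async def chat_stream():
    body = await request.get_json() or {}
    user_message = body.get("message", "")

    async def generate():
        # In the real app the chunks would come from the openai package's
        # streaming API; here we just echo the message word by word to show
        # the one-JSON-object-per-line framing.
        for word in user_message.split():
            yield json.dumps({"delta": {"content": word + " "}}) + "\n"

    return generate(), 200, {"Content-Type": "application/x-ndjson"}
```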
@@ -146,19 +146,9 @@ azd pipeline config

In order to run this app, you need to either have an Azure OpenAI account deployed (from the [deploying steps](#deploying)) or use a model from [GitHub models](https://github.com/marketplace/models).

-1. Copy `.env.sample.azure` into `.env`:
+1. If you already deployed the app using `azd up`, then a `.env` file was created with the necessary environment variables, and you can skip to step 3.

-```shell
-cp .env.sample .env
-```
-
-2. For use with Azure OpenAI, run this command to get the value of `AZURE_OPENAI_ENDPOINT` from your deployed resource group and paste it in the `.env` file:
-
-```shell
-azd env get-value AZURE_OPENAI_ENDPOINT
-```
-
-3. For use with GitHub models, change `OPENAI_HOST` to "github" in the `.env` file.
+2. To use the app with GitHub models, change `OPENAI_HOST` to "github" in the `.env` file.

You'll need a `GITHUB_TOKEN` environment variable that stores a GitHub personal access token.
If you're running this inside a GitHub Codespace, the token will be automatically available.
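As a quick reference, the `.env` settings these steps refer to might look roughly like the sketch below. The variable names come from the instructions; the values are placeholders, and the exact contents of the file generated by `azd up` may differ.

```shell
# Sketch of .env entries (values are placeholders, not real settings)
OPENAI_HOST=github
# AZURE_OPENAI_ENDPOINT="https://<your-openai-resource>.openai.azure.com/"   # typically present when deployed with `azd up`
```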
@@ -168,15 +158,13 @@ In order to run this app, you need to either have an Azure OpenAI account deploy
export GITHUB_TOKEN="<your-github-token-goes-here>"
```

-4. Start the development server:
+3. Start the development server:

```shell
python -m quart --app src.quartapp run --port 50505 --reload
```

-This will start the app on port 50505, and you can access it at `http://localhost:50505`.
-
-To save costs during development, you may point the app at a [local LLM server](/docs/local_ollama.md).
+This will start the app on port 50505, and you can access it at `http://localhost:50505`.

## Guidance

docs/local_docker.md: 19 changes (0 additions & 19 deletions)

This file was deleted.
