delete in-code config section in LLM Obs docs (#23169)
* delete in-code config section

* Editorial review

* Fix broken shortcode

* Add SDK documentation link

* Tweak wording

* Fix quickstart headers

---------

Co-authored-by: lievan <[email protected]>
Co-authored-by: Jen Gilbert <[email protected]>
3 people authored May 14, 2024
1 parent c751fe3 commit 1b50500
Showing 3 changed files with 15 additions and 38 deletions.
21 changes: 11 additions & 10 deletions content/en/tracing/llm_observability/quickstart.md
@@ -9,28 +9,26 @@ LLM Observability is not available in the US1-FED site.

<div class="alert alert-info">LLM Observability is in public beta.</div>

-Our quickstart docs make use of our Python SDK. For detailed usage, see [the SDK documentation][1]. If your application is written in another language, you can create traces by calling the [API][8] instead.

-## Command line quickstart
+Our quickstart docs make use of the LLM Observability SDK for Python. For detailed usage, see [the SDK documentation][1]. If your application is written in another language, you can create traces by calling the [API][8] instead.

Use the steps below to run a simple Python script that generates an LLM Observability trace.

-### Prerequisites
+## Prerequisites

- LLM Observability requires a Datadog API key (see [the instructions for creating an API key][7]).
- The example script below uses OpenAI, but you can modify it to use a different provider. To run the script as written, you need:
- An OpenAI API key stored in your environment as `OPENAI_API_KEY`. To create one, see [Account Setup][4] and [Set up your API key][6] in the OpenAI documentation.
- The OpenAI Python library installed. See [Setting up Python][5] in the OpenAI documentation for instructions.

-### 1. Install the SDK
+## 1. Install the SDK

Install the `ddtrace` package from its `main` branch:

{{< code-block lang="shell" >}}
pip install git+https://github.com/DataDog/dd-trace-py.git@main
{{< /code-block >}}

-### 2. Create the script
+## 2. Create the script

The Python script below makes a single OpenAI call. Save it as `quickstart.py`.

@@ -49,17 +47,19 @@ completion = oai_client.chat.completions.create(
)
{{< /code-block >}}
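The diff collapses most of `quickstart.py`; a minimal sketch of such a script might look like the following (the model name and prompt are illustrative assumptions, not taken from the source):

```python
import os

def build_messages(prompt):
    # Chat payload shape expected by the OpenAI chat completions endpoint.
    return [{"role": "user", "content": prompt}]

if __name__ == "__main__" and os.getenv("OPENAI_API_KEY"):
    # Imported lazily so the sketch can be read without the openai package.
    from openai import OpenAI

    oai_client = OpenAI()  # reads OPENAI_API_KEY from the environment
    completion = oai_client.chat.completions.create(
        model="gpt-3.5-turbo",  # illustrative model choice
        messages=build_messages("Why use LLM observability?"),
    )
    print(completion.choices[0].message.content)
```

Running it under `ddtrace-run` (as in step 3) is what produces the trace; the script itself contains no tracing code.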

-### 3. Run the script
+## 3. Run the script

-Run the Python script with the following shell command, and a trace of the OpenAI call will be sent to Datadog:
+Run the Python script with the following shell command, sending a trace of the OpenAI call to Datadog:

{{< code-block lang="shell" >}}
DD_LLMOBS_ENABLED=1 DD_LLMOBS_APP_NAME=onboarding-quickstart \
-DD_API_KEY=<YOUR_DATADOG_API_KEY> DD_SITE={{< region-param key="dd_site" code="true" >}} \
+DD_API_KEY=<YOUR_DATADOG_API_KEY> DD_SITE=<YOUR_DATADOG_SITE> \
DD_LLMOBS_NO_APM=1 ddtrace-run python quickstart.py
{{< /code-block >}}
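The command above passes its configuration entirely through environment variables. A small pre-flight helper like the following can catch a missing one before launch; the variable set mirrors the quickstart command and is illustrative, since the authoritative list lives in the SDK documentation:

```python
import os

# Variables used by the quickstart command; treat this set as illustrative.
REQUIRED_VARS = (
    "DD_LLMOBS_ENABLED",
    "DD_LLMOBS_APP_NAME",
    "DD_API_KEY",
    "DD_SITE",
    "DD_LLMOBS_NO_APM",
)

def missing_llmobs_vars(env=None):
    """Return the names of required variables that are unset or empty."""
    env = os.environ if env is None else env
    return [name for name in REQUIRED_VARS if not env.get(name)]
```

For example, `missing_llmobs_vars({"DD_API_KEY": "abc"})` reports every variable except `DD_API_KEY`.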

-### 4. View the trace
+For details on the required environment variables, see [the SDK documentation][9].
+
+## 4. View the trace

A trace of your LLM call should appear in [the Traces tab][3] of LLM Observability in Datadog.

@@ -74,3 +74,4 @@ The trace you see is composed of a single LLM span. The `ddtrace-run` command au
[6]: https://platform.openai.com/docs/quickstart/step-2-set-up-your-api-key
[7]: /account_management/api-app-keys/#add-an-api-key-or-client-token
[8]: /tracing/llm_observability/api
+[9]: /tracing/llm_observability/sdk/#command-line-setup
28 changes: 3 additions & 25 deletions content/en/tracing/llm_observability/sdk.md
@@ -28,36 +28,14 @@ pip install git+https://github.com/DataDog/dd-trace-py.git@main

2. LLM Observability requires a Datadog API key (see [the instructions for creating an API key][7]).

-#### In-code setup
-
-Enable LLM Observability through the `LLMObs.enable()` function, as shown in the example below.
-
-- Use the `integrations` argument to turn on automatic tracing for supported LLM Observability integrations (OpenAI, Bedrock, and LangChain).
-
-- If you do not have Datadog APM set up, set `dd_llmobs_no_apm` to `True`. This configures the `ddtrace` library to not send any data that requires Datadog APM.
-
-- When choosing a value for `ml_app`, see [Application naming guidelines](#application-naming-guidelines) for allowed characters and other constraints.
-
-{{< code-block lang="python" >}}
-from ddtrace.llmobs import LLMObs
-
-LLMObs.enable(
-    ml_app="<YOUR_ML_APP_NAME>",
-    dd_api_key="<YOUR_DATADOG_API_KEY>",
-    dd_site="{{< region-param key="dd_site" code="true" >}}",
-    dd_llmobs_no_apm=True,
-    integrations=[LLMObs.openai, LLMObs.botocore, LLMObs.langchain],
-)
-{{< /code-block >}}

-#### Command-line setup

-You can also enable LLM Observability by running your application using the `ddtrace-run` command and specifying the required environment variables.
+Enable LLM Observability by running your application using the `ddtrace-run` command and specifying the required environment variables.

**Note**: `ddtrace-run` automatically turns on all LLM Observability integrations.

{{< code-block lang="shell">}}
-DD_SITE={{< region-param key="dd_site" code="true" >}} DD_API_KEY=<YOUR_API_KEY> DD_LLMOBS_ENABLED=1 \
+DD_SITE=<YOUR_DATADOG_SITE> DD_API_KEY=<YOUR_API_KEY> DD_LLMOBS_ENABLED=1 \
DD_LLMOBS_APP_NAME=<YOUR_ML_APP_NAME> ddtrace-run <YOUR_APP_STARTUP_COMMAND>
{{< /code-block >}}

@@ -83,7 +61,7 @@ DD_LLMOBS_APP_NAME=<YOUR_ML_APP_NAME> ddtrace-run <YOUR_APP_STARTUP_COMMAND>

#### Application naming guidelines

-Your application name (the value of `ml_app` or `DD_LLMOBS_APP_NAME`) must start with a letter. It may contain the characters listed below:
+Your application name (the value of `DD_LLMOBS_APP_NAME`) must start with a letter. It may contain the characters listed below:

- Alphanumerics
- Underscores
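The naming rules above can be sketched as a validation check. This is a hypothetical helper, not part of the SDK, and the pattern is an assumption: the diff collapses the rest of the allowed-character list, so only the start-with-a-letter rule, alphanumerics, underscores, and hyphens (the quickstart itself uses `onboarding-quickstart`) are covered here:

```python
import re

# Assumed pattern: must start with a letter, then alphanumerics, underscores,
# or hyphens. The full allowed-character list is collapsed in the diff, so
# this sketch may be incomplete.
_APP_NAME_RE = re.compile(r"^[A-Za-z][A-Za-z0-9_-]*$")

def is_valid_app_name(name: str) -> bool:
    """Check a candidate DD_LLMOBS_APP_NAME against the assumed rules."""
    return bool(_APP_NAME_RE.fullmatch(name))
```

For example, `is_valid_app_name("onboarding-quickstart")` passes, while a name starting with a digit does not.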
@@ -39,9 +39,7 @@ Different span kinds also have different parent-child relationships. For details
To trace an LLM application:

1. [Install the LLM Observability SDK][1].
-1. Configure the SDK by doing one of the following:
-   - Add [setup code to your application][17] that provides the name of your application, your Datadog API key, and so on.
-   - [Start your application using `ddtrace-run`][5], providing the required environment variables.
+1. Configure the SDK by providing [the required environment variables][5] in your application startup command.
1. In your code, use the SDK to create spans representing your application's tasks.
- See the span creation example below.
- For additional examples and detailed usage, see the [Quickstart][10] and the [SDK documentation for tracing spans][11].
