anthropic observability
ivanagas committed Jan 17, 2025
1 parent 017509c commit ed83c81
Showing 3 changed files with 67 additions and 5 deletions.
49 changes: 49 additions & 0 deletions contents/docs/ai-engineering/_snippets/anthropic.mdx
@@ -0,0 +1,49 @@
Start by installing the Anthropic Python SDK:

```bash
pip install anthropic
```

In the spot where you initialize the Anthropic SDK, import PostHog and our Anthropic wrapper, initialize PostHog with your project API key and host (from [your project settings](https://us.posthog.com/settings/project)), and pass the PostHog client to the wrapper.

```python
from posthog.ai.anthropic import Anthropic
import posthog

posthog.project_api_key = "<ph_project_api_key>"
posthog.host = "<ph_client_api_host>"

client = Anthropic(
    api_key="sk-ant-api...",  # Replace with your Anthropic API key
    posthog_client=posthog
)
```

> **Note:** This also works with the `AsyncAnthropic` client, as well as `AnthropicBedrock`, `AnthropicVertex`, and their async versions.

Now, when you use the Anthropic SDK, it automatically captures many properties into PostHog including `$ai_input`, `$ai_input_tokens`, `$ai_latency`, `$ai_model`, `$ai_model_parameters`, `$ai_output_choices`, and `$ai_output_tokens`.

You can also capture additional properties like `posthog_distinct_id`, `posthog_trace_id`, `posthog_properties`, `posthog_groups`, and `posthog_privacy_mode`.

```python
response = client.messages.create(
    model="claude-3-opus-20240229",
    max_tokens=200,  # the Messages API requires max_tokens
    messages=[
        {
            "role": "user",
            "content": "Tell me a fun fact about hedgehogs"
        }
    ],
    posthog_distinct_id="user_123", # optional
    posthog_trace_id="trace_123", # optional
    posthog_properties={"conversation_id": "abc123", "paid": True}, # optional
    posthog_groups={"company": "company_id_in_your_db"}, # optional
    posthog_privacy_mode=False # optional
)

print(response.content[0].text)
```

> **Notes:**
> - This also works when message streams are used (e.g. `stream=True` or `client.messages.stream(...)`).
> - If you want to capture LLM events anonymously, **don't** pass a distinct ID to the request. See our docs on [anonymous vs identified events](/docs/data/anonymous-vs-identified-events) to learn more.
6 changes: 3 additions & 3 deletions contents/docs/ai-engineering/_snippets/openai.mdx
@@ -21,7 +21,7 @@ client = OpenAI(

> **Note:** This also works with the `AsyncOpenAI` client.
-Now, when you use the OpenAI SDK, it automatically captures many properties into PostHog including `$ai_input`, `$ai_input_tokens`, `$ai_latency`, `$ai_model`, `$ai_model_parameters`, `$ai_output`, and `$ai_output_tokens`.
+Now, when you use the OpenAI SDK, it automatically captures many properties into PostHog including `$ai_input`, `$ai_input_tokens`, `$ai_latency`, `$ai_model`, `$ai_model_parameters`, `$ai_output_choices`, and `$ai_output_tokens`.

You can also capture additional properties like `posthog_distinct_id`, `posthog_trace_id`, `posthog_properties`, `posthog_groups`, and `posthog_privacy_mode`.

@@ -33,8 +33,8 @@ response = client.chat.completions.create(
    ],
    posthog_distinct_id="user_123", # optional
    posthog_trace_id="trace_123", # optional
-   posthog_properties={"conversation_id": "abc123", "paid": True} # optional
-   posthog_groups={"company": "company_id_in_your_db"} # optional
+   posthog_properties={"conversation_id": "abc123", "paid": True}, # optional
+   posthog_groups={"company": "company_id_in_your_db"}, # optional
    posthog_privacy_mode=False # optional
)

17 changes: 15 additions & 2 deletions contents/docs/ai-engineering/observability.mdx
@@ -32,18 +32,23 @@ The rest of the setup depends on the LLM platform you're using. These SDKs _do n

import Tab from "components/Tab"
import OpenAIInstall from "./_snippets/openai.mdx"
+import AnthropicInstall from "./_snippets/anthropic.mdx"
import LangChainInstall from "./_snippets/langchain.mdx"

<!-- prettier-ignore -->
-<Tab.Group tabs={['OpenAI', 'Langchain']}>
+<Tab.Group tabs={['OpenAI', 'Anthropic', 'Langchain']}>
<Tab.List>
<Tab>OpenAI</Tab>
+<Tab>Anthropic</Tab>
<Tab>LangChain</Tab>
</Tab.List>
<Tab.Panels>
<Tab.Panel>
<OpenAIInstall />
</Tab.Panel>
+<Tab.Panel>
+<AnthropicInstall />
+</Tab.Panel>
<Tab.Panel>
<LangChainInstall />
</Tab.Panel>
@@ -52,7 +57,7 @@ import LangChainInstall from "./_snippets/langchain.mdx"

## Privacy mode

-To avoid storing potentially sensitive prompt and completion data, you can enable privacy mode. This excludes the `$ai_input` and `$ai_output` properties from being captured.
+To avoid storing potentially sensitive prompt and completion data, you can enable privacy mode. This excludes the `$ai_input` and `$ai_output_choices` properties from being captured.

This can be done either by setting the `privacy_mode` config option in the Python SDK like this:

@@ -75,6 +80,14 @@ client.chat.completions.create(
)
```

+```python file=Anthropic
+response = client.messages.create(
+    model="claude-3-opus-20240229",
+    messages=[...],
+    posthog_privacy_mode=True
+)
+```
+
```python file=LangChain
callback_handler = PosthogCallbackHandler(
    client,
