
Commit

Merge branch 'main' of github.com:langchain-ai/langchainjs into 7450
jacoblee93 committed Jan 10, 2025
2 parents cc1282e + 8ad8547 commit efd5d85
Showing 141 changed files with 958 additions and 448 deletions.
4 changes: 2 additions & 2 deletions cookbook/basic_critique_revise.ipynb
Expand Up @@ -20,8 +20,8 @@
"outputs": [],
"source": [
"Deno.env.set(\"OPENAI_API_KEY\", \"\");\n",
"Deno.env.set(\"LANGCHAIN_API_KEY\", \"\");\n",
"Deno.env.set(\"LANGCHAIN_TRACING_V2\", \"true\");\n",
"Deno.env.set(\"LANGSMITH_API_KEY\", \"\");\n",
"Deno.env.set(\"LANGSMITH_TRACING\", \"true\");\n",
"\n",
"import { z } from \"npm:zod\";\n",
"\n",
Expand Down
3 changes: 1 addition & 2 deletions deno.json
Expand Up @@ -28,7 +28,6 @@
"readline": "https://deno.land/x/[email protected]/mod.ts",
"uuid": "npm:/uuid",
"youtubei.js": "npm:/youtubei.js",
"youtube-transcript": "npm:/youtube-transcript",
"neo4j-driver": "npm:/neo4j-driver",
"axios": "npm:/axios",
"@mendable/firecrawl-js": "npm:/@mendable/firecrawl-js",
Expand All @@ -40,4 +39,4 @@
"@smithy/util-utf8": "npm:/@smithy/util-utf8",
"@aws-sdk/types": "npm:/@aws-sdk/types"
}
}
}
4 changes: 2 additions & 2 deletions docs/core_docs/docs/how_to/agent_executor.ipynb
Expand Up @@ -65,8 +65,8 @@
"After you sign up at the link above, make sure to set your environment variables to start logging traces:\n",
"\n",
"```shell\n",
"export LANGCHAIN_TRACING_V2=\"true\"\n",
"export LANGCHAIN_API_KEY=\"...\"\n",
"export LANGSMITH_TRACING=\"true\"\n",
"export LANGSMITH_API_KEY=\"...\"\n",
"\n",
"# Reduce tracing latency if you are not in a serverless environment\n",
"# export LANGCHAIN_CALLBACKS_BACKGROUND=true\n",
Expand Down
58 changes: 58 additions & 0 deletions docs/core_docs/docs/how_to/chat_model_caching.mdx
Expand Up @@ -98,6 +98,64 @@ import RedisCacheExample from "@examples/cache/chat_models/redis.ts";

<CodeBlock language="typescript">{RedisCacheExample}</CodeBlock>

## Caching with Upstash Redis

LangChain provides an Upstash Redis-based cache. Like the Redis-based cache, this cache is useful if you want to share the cache across multiple processes or servers. The Upstash Redis client uses HTTP and supports edge environments. To use it, you'll need to install the `@upstash/redis` package:

```bash npm2yarn
npm install @upstash/redis
```

You'll also need an [Upstash account](https://docs.upstash.com/redis#create-account) and a [Redis database](https://docs.upstash.com/redis#create-a-database) to connect to. Once you've done that, retrieve your REST URL and REST token.

Then, you can pass a `cache` option when you instantiate the LLM. For example:

import UpstashRedisCacheExample from "@examples/cache/chat_models/upstash_redis.ts";

<CodeBlock language="typescript">{UpstashRedisCacheExample}</CodeBlock>

You can also directly pass in a previously created [@upstash/redis](https://docs.upstash.com/redis/sdks/javascriptsdk/overview) client instance:

import AdvancedUpstashRedisCacheExample from "@examples/cache/chat_models/upstash_redis_advanced.ts";

<CodeBlock language="typescript">{AdvancedUpstashRedisCacheExample}</CodeBlock>
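
Since the imported example file isn't shown in this diff, here is a minimal sketch of the pattern, assuming placeholder Upstash credentials (the URL, token, and model name below are stand-ins, and the exact constructor options may differ slightly from the shipped example):

```typescript
import { ChatOpenAI } from "@langchain/openai";
import { UpstashRedisCache } from "@langchain/community/caches/upstash_redis";

// Placeholder values; substitute the REST URL and token from your Upstash console.
const cache = new UpstashRedisCache({
  config: {
    url: "UPSTASH_REDIS_REST_URL",
    token: "UPSTASH_REDIS_REST_TOKEN",
  },
});

// Any chat model that accepts a `cache` option works the same way.
const model = new ChatOpenAI({ model: "gpt-4o-mini", cache });

// Repeated identical prompts after the first call should be served from the cache.
const response = await model.invoke("Tell me a joke");
console.log(response);
```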

## Caching with Vercel KV

LangChain provides a Vercel KV-based cache. Like the Redis-based cache, this cache is useful if you want to share the cache across multiple processes or servers. The Vercel KV client uses HTTP and supports edge environments. To use it, you'll need to install the `@vercel/kv` package:

```bash npm2yarn
npm install @vercel/kv
```

You'll also need a Vercel account and a [KV database](https://vercel.com/docs/storage/vercel-kv/kv-reference) to connect to. Once you've done that, retrieve your REST URL and REST token.

Then, you can pass a `cache` option when you instantiate the LLM. For example:

import VercelKVCacheExample from "@examples/cache/chat_models/vercel_kv.ts";

<CodeBlock language="typescript">{VercelKVCacheExample}</CodeBlock>
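
The referenced example isn't visible in this diff; a rough sketch of its expected shape, using placeholder KV credentials (the URL, token, and TTL value are assumptions for illustration), looks roughly like this:

```typescript
import { ChatOpenAI } from "@langchain/openai";
import { VercelKVCache } from "@langchain/community/caches/vercel_kv";
import { createClient } from "@vercel/kv";

// Placeholder credentials; use the REST URL and token from your Vercel KV dashboard.
const cache = new VercelKVCache({
  client: createClient({
    url: "VERCEL_KV_REST_API_URL",
    token: "VERCEL_KV_REST_API_TOKEN",
  }),
  ttl: 3600, // optional: expire cached generations after an hour
});

const model = new ChatOpenAI({ model: "gpt-4o-mini", cache });
```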

## Caching with Cloudflare KV

:::info
This integration is only supported in Cloudflare Workers.
:::

If you're deploying your project as a Cloudflare Worker, you can use LangChain's Cloudflare KV-powered LLM cache.

For information on how to set up KV in Cloudflare, see [the official documentation](https://developers.cloudflare.com/kv/).

**Note:** If you are using TypeScript, you may need to install types if they aren't already present:

```bash npm2yarn
npm install -S @cloudflare/workers-types
```

import CloudflareExample from "@examples/cache/chat_models/cloudflare_kv.ts";

<CodeBlock language="typescript">{CloudflareExample}</CodeBlock>
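
As a hedged sketch of how this might look inside a Worker (the `KV_NAMESPACE` binding name and model name are assumptions; bind your own namespace in `wrangler.toml`):

```typescript
import type { KVNamespace } from "@cloudflare/workers-types";
import { ChatOpenAI } from "@langchain/openai";
import { CloudflareKVCache } from "@langchain/cloudflare";

export interface Env {
  // Assumed binding name for the KV namespace configured in wrangler.toml.
  KV_NAMESPACE: KVNamespace;
  OPENAI_API_KEY: string;
}

export default {
  async fetch(_request: Request, env: Env) {
    // Back the model cache with the Worker's KV namespace binding.
    const cache = new CloudflareKVCache(env.KV_NAMESPACE);
    const model = new ChatOpenAI({
      cache,
      model: "gpt-4o-mini",
      apiKey: env.OPENAI_API_KEY,
    });
    const response = await model.invoke("Tell me a joke");
    return new Response(JSON.stringify(response), {
      headers: { "content-type": "application/json" },
    });
  },
};
```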

## Caching on the File System

:::warning
Expand Down
4 changes: 2 additions & 2 deletions docs/core_docs/docs/how_to/debugging.mdx
Expand Up @@ -16,8 +16,8 @@ The best way to do this is with [LangSmith](https://smith.langchain.com).
After you sign up at the link above, make sure to set your environment variables to start logging traces:

```shell
export LANGCHAIN_TRACING_V2="true"
export LANGCHAIN_API_KEY="..."
export LANGSMITH_TRACING="true"
export LANGSMITH_API_KEY="..."

# Reduce tracing latency if you are not in a serverless environment
# export LANGCHAIN_CALLBACKS_BACKGROUND=true
Expand Down
2 changes: 1 addition & 1 deletion docs/core_docs/docs/how_to/graph_constructing.ipynb
Expand Up @@ -41,7 +41,7 @@
"\n",
"# Optional, use LangSmith for best-in-class observability\n",
"LANGSMITH_API_KEY=your-api-key\n",
"LANGCHAIN_TRACING_V2=true\n",
"LANGSMITH_TRACING=true\n",
"\n",
"# Reduce tracing latency if you are not in a serverless environment\n",
"# LANGCHAIN_CALLBACKS_BACKGROUND=true\n",
Expand Down
2 changes: 1 addition & 1 deletion docs/core_docs/docs/how_to/graph_mapping.ipynb
Expand Up @@ -40,7 +40,7 @@
"\n",
"# Optional, use LangSmith for best-in-class observability\n",
"LANGSMITH_API_KEY=your-api-key\n",
"LANGCHAIN_TRACING_V2=true\n",
"LANGSMITH_TRACING=true\n",
"\n",
"# Reduce tracing latency if you are not in a serverless environment\n",
"# LANGCHAIN_CALLBACKS_BACKGROUND=true\n",
Expand Down
2 changes: 1 addition & 1 deletion docs/core_docs/docs/how_to/graph_prompting.ipynb
Expand Up @@ -49,7 +49,7 @@
"\n",
"# Optional, use LangSmith for best-in-class observability\n",
"LANGSMITH_API_KEY=your-api-key\n",
"LANGCHAIN_TRACING_V2=true\n",
"LANGSMITH_TRACING=true\n",
"\n",
"# Reduce tracing latency if you are not in a serverless environment\n",
"# LANGCHAIN_CALLBACKS_BACKGROUND=true\n",
Expand Down
2 changes: 1 addition & 1 deletion docs/core_docs/docs/how_to/graph_semantic.ipynb
Expand Up @@ -56,7 +56,7 @@
"\n",
"# Optional, use LangSmith for best-in-class observability\n",
"LANGSMITH_API_KEY=your-api-key\n",
"LANGCHAIN_TRACING_V2=true\n",
"LANGSMITH_TRACING=true\n",
"\n",
"# Reduce tracing latency if you are not in a serverless environment\n",
"# LANGCHAIN_CALLBACKS_BACKGROUND=true\n",
Expand Down
16 changes: 16 additions & 0 deletions docs/core_docs/docs/how_to/llm_caching.mdx
Expand Up @@ -161,6 +161,22 @@ import AdvancedUpstashRedisCacheExample from "@examples/cache/upstash_redis_adva

<CodeBlock language="typescript">{AdvancedUpstashRedisCacheExample}</CodeBlock>

## Caching with Vercel KV

LangChain provides a Vercel KV-based cache. Like the Redis-based cache, this cache is useful if you want to share the cache across multiple processes or servers. The Vercel KV client uses HTTP and supports edge environments. To use it, you'll need to install the `@vercel/kv` package:

```bash npm2yarn
npm install @vercel/kv
```

You'll also need a Vercel account and a [KV database](https://vercel.com/docs/storage/vercel-kv/kv-reference) to connect to. Once you've done that, retrieve your REST URL and REST token.

Then, you can pass a `cache` option when you instantiate the LLM. For example:

import VercelKVCacheExample from "@examples/cache/vercel_kv.ts";

<CodeBlock language="typescript">{VercelKVCacheExample}</CodeBlock>
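
This mirrors the chat model caching example, except the base LLM class carries the `cache` option. A minimal sketch, assuming placeholder KV credentials, might look like:

```typescript
import { OpenAI } from "@langchain/openai";
import { VercelKVCache } from "@langchain/community/caches/vercel_kv";
import { createClient } from "@vercel/kv";

// Placeholder credentials; use the REST URL and token from your Vercel KV dashboard.
const cache = new VercelKVCache({
  client: createClient({
    url: "VERCEL_KV_REST_API_URL",
    token: "VERCEL_KV_REST_API_TOKEN",
  }),
});

// Same pattern as the chat model cache, but with the base LLM class.
const model = new OpenAI({ cache });
```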

## Caching with Cloudflare KV

:::info
Expand Down
8 changes: 4 additions & 4 deletions docs/core_docs/docs/how_to/migrate_agent.ipynb
Expand Up @@ -57,10 +57,10 @@
"// process.env.OPENAI_API_KEY = \"...\";\n",
"\n",
"// Optional, add tracing in LangSmith\n",
"// process.env.LANGCHAIN_API_KEY = \"ls...\";\n",
"// process.env.LANGSMITH_API_KEY = \"ls...\";\n",
"// process.env.LANGCHAIN_CALLBACKS_BACKGROUND = \"true\";\n",
"// process.env.LANGCHAIN_TRACING_V2 = \"true\";\n",
"// process.env.LANGCHAIN_PROJECT = \"How to migrate: LangGraphJS\";\n",
"// process.env.LANGSMITH_TRACING = \"true\";\n",
"// process.env.LANGSMITH_PROJECT = \"How to migrate: LangGraphJS\";\n",
"\n",
"// Reduce tracing latency if you are not in a serverless environment\n",
"// process.env.LANGCHAIN_CALLBACKS_BACKGROUND = \"true\";"
Expand Down Expand Up @@ -1337,4 +1337,4 @@
},
"nbformat": 4,
"nbformat_minor": 5
}
}
4 changes: 2 additions & 2 deletions docs/core_docs/docs/how_to/qa_chat_history_how_to.ipynb
Expand Up @@ -63,8 +63,8 @@
"\n",
"\n",
"```bash\n",
"export LANGCHAIN_TRACING_V2=true\n",
"export LANGCHAIN_API_KEY=YOUR_KEY\n",
"export LANGSMITH_TRACING=true\n",
"export LANGSMITH_API_KEY=YOUR_KEY\n",
"\n",
"# Reduce tracing latency if you are not in a serverless environment\n",
"# export LANGCHAIN_CALLBACKS_BACKGROUND=true\n",
Expand Down
4 changes: 2 additions & 2 deletions docs/core_docs/docs/how_to/qa_citations.ipynb
Expand Up @@ -55,8 +55,8 @@
"\n",
"\n",
"```bash\n",
"export LANGCHAIN_TRACING_V2=true\n",
"export LANGCHAIN_API_KEY=YOUR_KEY\n",
"export LANGSMITH_TRACING=true\n",
"export LANGSMITH_API_KEY=YOUR_KEY\n",
"\n",
"# Reduce tracing latency if you are not in a serverless environment\n",
"# export LANGCHAIN_CALLBACKS_BACKGROUND=true\n",
Expand Down
2 changes: 1 addition & 1 deletion docs/core_docs/docs/how_to/qa_per_user.ipynb
Expand Up @@ -88,7 +88,7 @@
"\n",
"# Optional, use LangSmith for best-in-class observability\n",
"LANGSMITH_API_KEY=your-api-key\n",
"LANGCHAIN_TRACING_V2=true\n",
"LANGSMITH_TRACING=true\n",
"\n",
"# Reduce tracing latency if you are not in a serverless environment\n",
"# LANGCHAIN_CALLBACKS_BACKGROUND=true\n",
Expand Down
4 changes: 2 additions & 2 deletions docs/core_docs/docs/how_to/qa_sources.ipynb
Expand Up @@ -53,8 +53,8 @@
"\n",
"\n",
"```bash\n",
"export LANGCHAIN_TRACING_V2=true\n",
"export LANGCHAIN_API_KEY=YOUR_KEY\n",
"export LANGSMITH_TRACING=true\n",
"export LANGSMITH_API_KEY=YOUR_KEY\n",
"\n",
"# Reduce tracing latency if you are not in a serverless environment\n",
"# export LANGCHAIN_CALLBACKS_BACKGROUND=true\n",
Expand Down
4 changes: 2 additions & 2 deletions docs/core_docs/docs/how_to/qa_streaming.ipynb
Expand Up @@ -53,8 +53,8 @@
"\n",
"\n",
"```bash\n",
"export LANGCHAIN_TRACING_V2=true\n",
"export LANGCHAIN_API_KEY=YOUR_KEY\n",
"export LANGSMITH_TRACING=true\n",
"export LANGSMITH_API_KEY=YOUR_KEY\n",
"\n",
"# Reduce tracing latency if you are not in a serverless environment\n",
"# export LANGCHAIN_CALLBACKS_BACKGROUND=true\n",
Expand Down
2 changes: 1 addition & 1 deletion docs/core_docs/docs/how_to/query_few_shot.ipynb
Expand Up @@ -45,7 +45,7 @@
"```\n",
"# Optional, use LangSmith for best-in-class observability\n",
"LANGSMITH_API_KEY=your-api-key\n",
"LANGCHAIN_TRACING_V2=true\n",
"LANGSMITH_TRACING=true\n",
"\n",
"# Reduce tracing latency if you are not in a serverless environment\n",
"# LANGCHAIN_CALLBACKS_BACKGROUND=true\n",
Expand Down
2 changes: 1 addition & 1 deletion docs/core_docs/docs/how_to/query_high_cardinality.ipynb
Expand Up @@ -47,7 +47,7 @@
"```\n",
"# Optional, use LangSmith for best-in-class observability\n",
"LANGSMITH_API_KEY=your-api-key\n",
"LANGCHAIN_TRACING_V2=true\n",
"LANGSMITH_TRACING=true\n",
"\n",
"# Reduce tracing latency if you are not in a serverless environment\n",
"# LANGCHAIN_CALLBACKS_BACKGROUND=true\n",
Expand Down
2 changes: 1 addition & 1 deletion docs/core_docs/docs/how_to/query_multiple_queries.ipynb
Expand Up @@ -45,7 +45,7 @@
"\n",
"# Optional, use LangSmith for best-in-class observability\n",
"LANGSMITH_API_KEY=your-api-key\n",
"LANGCHAIN_TRACING_V2=true\n",
"LANGSMITH_TRACING=true\n",
"\n",
"# Reduce tracing latency if you are not in a serverless environment\n",
"# LANGCHAIN_CALLBACKS_BACKGROUND=true\n",
Expand Down
2 changes: 1 addition & 1 deletion docs/core_docs/docs/how_to/query_multiple_retrievers.ipynb
Expand Up @@ -45,7 +45,7 @@
"\n",
"# Optional, use LangSmith for best-in-class observability\n",
"LANGSMITH_API_KEY=your-api-key\n",
"LANGCHAIN_TRACING_V2=true\n",
"LANGSMITH_TRACING=true\n",
"\n",
"# Reduce tracing latency if you are not in a serverless environment\n",
"# LANGCHAIN_CALLBACKS_BACKGROUND=true\n",
Expand Down
2 changes: 1 addition & 1 deletion docs/core_docs/docs/how_to/query_no_queries.ipynb
Expand Up @@ -47,7 +47,7 @@
"\n",
"# Optional, use LangSmith for best-in-class observability\n",
"LANGSMITH_API_KEY=your-api-key\n",
"LANGCHAIN_TRACING_V2=true\n",
"LANGSMITH_TRACING=true\n",
"\n",
"# Reduce tracing latency if you are not in a serverless environment\n",
"# LANGCHAIN_CALLBACKS_BACKGROUND=true\n",
Expand Down
4 changes: 2 additions & 2 deletions docs/core_docs/docs/how_to/sql_large_db.mdx
Expand Up @@ -23,8 +23,8 @@ npm install langchain @langchain/community @langchain/openai typeorm sqlite3
```bash
export OPENAI_API_KEY="your api key"
# Uncomment the below to use LangSmith. Not required.
# export LANGCHAIN_API_KEY="your api key"
# export LANGCHAIN_TRACING_V2=true
# export LANGSMITH_API_KEY="your api key"
# export LANGSMITH_TRACING=true

# Reduce tracing latency if you are not in a serverless environment
# export LANGCHAIN_CALLBACKS_BACKGROUND=true
Expand Down
4 changes: 2 additions & 2 deletions docs/core_docs/docs/how_to/sql_prompting.mdx
Expand Up @@ -22,8 +22,8 @@ npm install @langchain/community @langchain/openai typeorm sqlite3
```bash
export OPENAI_API_KEY="your api key"
# Uncomment the below to use LangSmith. Not required.
# export LANGCHAIN_API_KEY="your api key"
# export LANGCHAIN_TRACING_V2=true
# export LANGSMITH_API_KEY="your api key"
# export LANGSMITH_TRACING=true

# Reduce tracing latency if you are not in a serverless environment
# export LANGCHAIN_CALLBACKS_BACKGROUND=true
Expand Down
4 changes: 2 additions & 2 deletions docs/core_docs/docs/how_to/sql_query_checking.mdx
Expand Up @@ -26,8 +26,8 @@ npm install @langchain/community @langchain/openai typeorm sqlite3
```bash
export OPENAI_API_KEY="your api key"
# Uncomment the below to use LangSmith. Not required.
# export LANGCHAIN_API_KEY="your api key"
# export LANGCHAIN_TRACING_V2=true
# export LANGSMITH_API_KEY="your api key"
# export LANGSMITH_TRACING=true

# Reduce tracing latency if you are not in a serverless environment
# export LANGCHAIN_CALLBACKS_BACKGROUND=true
Expand Down
2 changes: 1 addition & 1 deletion docs/core_docs/docs/how_to/tools_prompting.ipynb
Expand Up @@ -52,7 +52,7 @@
"```\n",
"# Optional, use LangSmith for best-in-class observability\n",
"LANGSMITH_API_KEY=your-api-key\n",
"LANGCHAIN_TRACING_V2=true\n",
"LANGSMITH_TRACING=true\n",
"\n",
"# Reduce tracing latency if you are not in a serverless environment\n",
"# LANGCHAIN_CALLBACKS_BACKGROUND=true\n",
Expand Down
4 changes: 2 additions & 2 deletions docs/core_docs/docs/integrations/chat/anthropic.ipynb
Expand Up @@ -55,8 +55,8 @@
"If you want to get automated tracing of your model calls you can also set your [LangSmith](https://docs.smith.langchain.com/) API key by uncommenting below:\n",
"\n",
"```bash\n",
"# export LANGCHAIN_TRACING_V2=\"true\"\n",
"# export LANGCHAIN_API_KEY=\"your-api-key\"\n",
"# export LANGSMITH_TRACING=\"true\"\n",
"# export LANGSMITH_API_KEY=\"your-api-key\"\n",
"```\n",
"\n",
"### Installation\n",
Expand Down
4 changes: 2 additions & 2 deletions docs/core_docs/docs/integrations/chat/azure.ipynb
Expand Up @@ -66,8 +66,8 @@
"If you want to get automated tracing of your model calls you can also set your [LangSmith](https://docs.smith.langchain.com/) API key by uncommenting below:\n",
"\n",
"```bash\n",
"# export LANGCHAIN_TRACING_V2=\"true\"\n",
"# export LANGCHAIN_API_KEY=\"your-api-key\"\n",
"# export LANGSMITH_TRACING=\"true\"\n",
"# export LANGSMITH_API_KEY=\"your-api-key\"\n",
"```\n",
"\n",
"### Installation\n",
Expand Down
4 changes: 2 additions & 2 deletions docs/core_docs/docs/integrations/chat/bedrock.ipynb
Expand Up @@ -57,8 +57,8 @@
"If you want to get automated tracing of your model calls you can also set your [LangSmith](https://docs.smith.langchain.com/) API key by uncommenting below:\n",
"\n",
"```bash\n",
"# export LANGCHAIN_TRACING_V2=\"true\"\n",
"# export LANGCHAIN_API_KEY=\"your-api-key\"\n",
"# export LANGSMITH_TRACING=\"true\"\n",
"# export LANGSMITH_API_KEY=\"your-api-key\"\n",
"```\n",
"\n",
"### Installation\n",
Expand Down
4 changes: 2 additions & 2 deletions docs/core_docs/docs/integrations/chat/bedrock_converse.ipynb
Expand Up @@ -51,8 +51,8 @@
"If you want to get automated tracing of your model calls you can also set your [LangSmith](https://docs.smith.langchain.com/) API key by uncommenting below:\n",
"\n",
"```bash\n",
"# export LANGCHAIN_TRACING_V2=\"true\"\n",
"# export LANGCHAIN_API_KEY=\"your-api-key\"\n",
"# export LANGSMITH_TRACING=\"true\"\n",
"# export LANGSMITH_API_KEY=\"your-api-key\"\n",
"```\n",
"\n",
"### Installation\n",
Expand Down