removed CardLists for LLMs and ChatModels (#12307)
Problem statement:
The `integrations/llms` and `integrations/chat` pages have a sidebar ToC, and
they also have a second ToC at the end of the page. The end-of-page ToC is
unnecessary, it is confusing when we mix the index page styles, and it
requires manual maintenance. So, I removed the ToC at the end of the page
(discussed with and approved by @baskaryan).
leo-gan authored Oct 26, 2023
1 parent ebf998a commit 869a49a
Showing 1 changed file with 0 additions and 6 deletions: docs/scripts/model_feat_table.py
@@ -33,8 +33,6 @@
 # LLMs
-import DocCardList from "@theme/DocCardList";
 ## Features (natively supported)
 All LLMs implement the Runnable interface, which comes with default implementations of all methods, ie. `ainvoke`, `batch`, `abatch`, `stream`, `astream`. This gives all LLMs basic support for async, streaming and batch, which by default is implemented as below:
 - *Async* support defaults to calling the respective sync method in asyncio's default thread pool executor. This lets other async functions in your application make progress while the LLM is being executed, by moving this call to a background thread.
@@ -45,7 +43,6 @@
 {table}
-<DocCardList />
 """

 CHAT_MODEL_TEMPLATE = """\
@@ -56,8 +53,6 @@
 # Chat models
-import DocCardList from "@theme/DocCardList";
 ## Features (natively supported)
 All ChatModels implement the Runnable interface, which comes with default implementations of all methods, ie. `ainvoke`, `batch`, `abatch`, `stream`, `astream`. This gives all ChatModels basic support for async, streaming and batch, which by default is implemented as below:
 - *Async* support defaults to calling the respective sync method in asyncio's default thread pool executor. This lets other async functions in your application make progress while the ChatModel is being executed, by moving this call to a background thread.
@@ -69,7 +64,6 @@
 {table}
-<DocCardList />
 """
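The template text in the diff states that the default async support delegates the sync method to asyncio's default thread pool executor. Below is a minimal sketch of that fallback pattern; `SyncLLM` and its `invoke`/`ainvoke` methods are hypothetical stand-ins for illustration, not the actual LangChain `Runnable` implementation.

```python
import asyncio
from functools import partial


class SyncLLM:
    """Hypothetical model with only a blocking sync call, plus the
    async fallback described in the docs template: `ainvoke` runs the
    sync `invoke` in asyncio's default thread pool executor."""

    def invoke(self, prompt: str) -> str:
        # Stand-in for a blocking model call (network request, etc.).
        return f"echo: {prompt}"

    async def ainvoke(self, prompt: str) -> str:
        loop = asyncio.get_running_loop()
        # Run the blocking sync method in the default executor (None),
        # so other coroutines can make progress in the meantime.
        return await loop.run_in_executor(None, partial(self.invoke, prompt))


if __name__ == "__main__":
    print(asyncio.run(SyncLLM().ainvoke("hi")))
```

Because the blocking work is moved to a background thread, concurrent coroutines (other requests, timers, streaming consumers) are not starved while the model call runs.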
