diff --git a/docs/docs/tutorials/llm_chain.ipynb b/docs/docs/tutorials/llm_chain.ipynb
index db837dad56b97..b2199b6acc425 100644
--- a/docs/docs/tutorials/llm_chain.ipynb
+++ b/docs/docs/tutorials/llm_chain.ipynb
@@ -23,7 +23,7 @@
     "\n",
     "- Using [language models](/docs/concepts/chat_models)\n",
     "\n",
-    "- Using [PromptTemplates](/docs/concepts/prompt_templates) and [OutputParsers](/docs/concepts/output_parsers)\n",
+    "- Using [Prompt Templates](/docs/concepts/prompt_templates) and [Output Parsers](/docs/concepts/output_parsers)\n",
     "\n",
     "- Using [LangChain Expression Language (LCEL)](/docs/concepts/lcel) to chain components together\n",
     "\n",
@@ -164,7 +164,7 @@
    "id": "32bd03ed",
    "metadata": {},
    "source": [
-    "## OutputParsers\n",
+    "## Output Parsers\n",
     "\n",
     "Notice that the response from the model is an `AIMessage`. This contains a string response along with other metadata about the response. Oftentimes we may just want to work with the string response. We can parse out just this response by using a simple output parser.\n",
     "\n",
@@ -530,8 +530,8 @@
     "\n",
     "### Client\n",
     "\n",
-    "Now let's set up a client for programmatically interacting with our service. We can easily do this with the [langserve.RemoteRunnable](/docs/langserve/#client).\n",
-    "Using this, we can interact with the served chain as if it were running client-side."
+    "Now let's set up a client for programmatically interacting with our service. We can easily do this with the [RemoteRunnable](/docs/langserve/#client) class from LangServe.\n",
+    "Using this, we can interact with the served chain as if we were running it from the client side."
    ]
   },
   {
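
For reference, the "Output Parsers" section retitled in the second hunk covers pulling the plain string out of the model's `AIMessage`. A minimal sketch of that pattern, using the standard `StrOutputParser` from `langchain_core` (the message content here is illustrative, not from the tutorial):

```python
from langchain_core.messages import AIMessage
from langchain_core.output_parsers import StrOutputParser

parser = StrOutputParser()

# An AIMessage carries the string content plus response metadata;
# the parser extracts just the string.
message = AIMessage(content="Ciao!")
print(parser.invoke(message))  # -> "Ciao!"
```

In the tutorial's LCEL style, the same parser is typically composed into a chain (`chain = prompt | model | parser`) so that invoking the chain returns a string directly.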
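Likewise, the client section reworded in the third hunk refers to LangServe's `RemoteRunnable`. A minimal client sketch, assuming the tutorial's service is running locally and serving the chain at `/chain/` (the URL and input keys are illustrative):

```python
from langserve import RemoteRunnable

# Point the client at the served chain's endpoint.
remote_chain = RemoteRunnable("http://localhost:8000/chain/")

# The remote chain exposes the same Runnable interface as a local one,
# so .invoke() behaves as if the chain were running client-side.
result = remote_chain.invoke({"language": "italian", "text": "hi"})
print(result)
```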