Fix Build Doc CI issues #3783

Merged · 13 commits · Oct 31, 2024
103 changes: 0 additions & 103 deletions docs/cloud/azureai/tracing/index.md

This file was deleted.

32 changes: 0 additions & 32 deletions docs/cloud/azureai/tracing/run_tracking.md

This file was deleted.

2 changes: 1 addition & 1 deletion docs/concepts/concept-connections.md
@@ -15,7 +15,7 @@ Prompt flow provides a variety of pre-built connections, including Azure OpenAI,
| [OpenAI](https://openai.com/) | LLM or Python |
| [Cognitive Search](https://azure.microsoft.com/products/search) | Vector DB Lookup or Python |
| [Serp](https://serpapi.com/) | Serp API or Python |
- | [Serverless](https://learn.microsoft.com/azure/ai-studio/concepts/deployments-overview#deploy-models-as-serverless-apis) | LLM or Python |
+ | [Serverless](https://learn.microsoft.com/azure/ai-studio/concepts/deployments-overview) | LLM or Python |
| Custom | Python |

By leveraging connections in prompt flow, you can easily establish and manage connections to external APIs and data sources, facilitating efficient data exchange and interaction within your AI applications.
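
As background for the connection types in this table, here is a minimal sketch (not part of this PR's diff) of registering a connection with the promptflow SDK; the import paths and field names follow the promptflow docs, and the endpoint and key are placeholders:

```python
# Minimal sketch: register an Azure OpenAI connection locally so that LLM and
# Python tools can reference it by name. All credential values are placeholders.
from promptflow.client import PFClient
from promptflow.entities import AzureOpenAIConnection

pf = PFClient()

connection = AzureOpenAIConnection(
    name="my_azure_open_ai_connection",
    api_key="<your-api-key>",  # placeholder; never hard-code real secrets
    api_base="https://<your-resource>.openai.azure.com/",
    api_type="azure",
)

# create_or_update persists the connection so flows can refer to it by name.
result = pf.connections.create_or_update(connection)
print(result)
```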
6 changes: 3 additions & 3 deletions docs/reference/tools-reference/llm-tool.md
@@ -1,7 +1,7 @@
- # LLM
+ # LLM

## Introduction
- Prompt flow LLM tool enables you to leverage widely used large language models like [OpenAI](https://platform.openai.com/), [Azure OpenAI (AOAI)](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/overview), and models in [Azure AI Studio model catalog](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/model-catalog) for natural language processing.
+ Prompt flow LLM tool enables you to leverage widely used large language models like [OpenAI](https://platform.openai.com/), [Azure OpenAI (AOAI)](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/overview), and models in [Azure AI Studio model catalog](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/model-catalog) for natural language processing.
> [!NOTE]
> The previous version of the LLM tool is now being deprecated. Please upgrade to the latest [promptflow-tools](https://pypi.org/project/promptflow-tools/) package to consume the new LLM tools.

@@ -25,7 +25,7 @@ Create OpenAI resources, Azure OpenAI resources or MaaS deployment with the LLM

- **MaaS deployment**

- Create MaaS deployment for models in Azure AI Studio model catalog with [instruction](https://learn.microsoft.com/azure/ai-studio/concepts/deployments-overview#deploy-models-as-serverless-apis)
+ Create MaaS deployment for models in Azure AI Studio model catalog with [instruction](https://learn.microsoft.com/azure/ai-studio/concepts/deployments-overview)

You can create a serverless connection to use this MaaS deployment.
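
As an illustration of that step (outside this PR's diff), here is a hedged sketch of registering a serverless connection with the promptflow SDK; the `ServerlessConnection` entity and its fields are assumed from the promptflow connection reference, and the endpoint values are placeholders from your MaaS deployment:

```python
# Hedged sketch: ServerlessConnection and its fields are assumed from the
# promptflow connection reference; the endpoint URL and key are placeholders
# taken from the MaaS deployment created in Azure AI Studio.
from promptflow.client import PFClient
from promptflow.entities import ServerlessConnection

pf = PFClient()

connection = ServerlessConnection(
    name="my_maas_connection",
    api_key="<maas-endpoint-key>",
    api_base="https://<deployment>.<region>.models.ai.azure.com",
)
pf.connections.create_or_update(connection)
```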

4 changes: 2 additions & 2 deletions examples/tutorials/run-flow-with-pipeline/pipeline.ipynb
@@ -136,7 +136,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"When using the `load_component` function and the flow YAML specification, your flow is automatically transformed into a __[parallel component](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-use-parallel-job-in-pipeline?view=azureml-api-2&tabs=cliv2#why-are-parallel-jobs-needed)__. This parallel component is designed for large-scale, offline, parallelized processing with efficiency and resilience. Here are some key features of this auto-converted component:\n",
"When using the `load_component` function and the flow YAML specification, your flow is automatically transformed into a __[parallel component](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-use-parallel-job-in-pipeline?view=azureml-api-2&tabs=cliv2)__. This parallel component is designed for large-scale, offline, parallelized processing with efficiency and resilience. Here are some key features of this auto-converted component:\n",
"\n",
" - Pre-defined input and output ports:\n",
"\n",
@@ -176,7 +176,7 @@
"## 3.1 Declare input and output\n",
"To supply your pipeline with data, you need to declare an input using the `path`, `type`, and `mode` properties. Please note: `mount` is the default and suggested mode for your file or folder data input.\n",
"\n",
"Declaring the pipeline output is optional. However, if you require a customized output path in the cloud, you can follow the example below to set the path on the datastore. For more detailed information on valid path values, refer to this documentation - [manage pipeline inputs outputs](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-manage-inputs-outputs-pipeline?view=azureml-api-2&tabs=cli#path-and-mode-for-data-inputsoutputs)."
"Declaring the pipeline output is optional. However, if you require a customized output path in the cloud, you can follow the example below to set the path on the datastore. For more detailed information on valid path values, refer to this documentation - [manage pipeline inputs outputs](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-manage-inputs-outputs-pipeline?view=azureml-api-2&tabs=cli)."
]
},
{
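
A short sketch of the declaration pattern this cell describes, using the azure-ai-ml `Input` and `Output` types; the local file path and the datastore path are placeholders:

```python
# Sketch: declare a pipeline input (mount mode, the suggested default) and an
# optional output pinned to a custom datastore path. Paths are placeholders.
from azure.ai.ml import Input, Output
from azure.ai.ml.constants import AssetTypes

data_input = Input(
    path="./data/questions.jsonl",  # local file, uploaded on submission
    type=AssetTypes.URI_FILE,
    mode="mount",
)

custom_output = Output(
    type=AssetTypes.URI_FOLDER,
    path="azureml://datastores/workspaceblobstore/paths/pipeline-outputs/",
    mode="rw_mount",
)
```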
(changes to an additional file; file path not captured)
@@ -15,8 +15,6 @@
"AutoGen offers conversable agents powered by LLM, tool or human, which can be used to perform tasks collectively via automated chat. This framework allows tool use and human participation through multi-agent conversation.\n",
"Please find documentation about this feature [here](https://microsoft.github.io/autogen/docs/Use-Cases/agent_chat).\n",
"\n",
"This notebook is modified based on [autogen agent chat example](https://github.com/microsoft/autogen/blob/main/notebook/agentchat_groupchat.ipynb). \n",
"\n",
"**Learning Objectives** - Upon completing this tutorial, you should be able to:\n",
"\n",
"- Trace LLM (OpenAI) Calls and visualize the trace of your application.\n",
@@ -45,7 +43,7 @@
"\n",
"You can create the config file named `OAI_CONFIG_LIST.json` from example file: `OAI_CONFIG_LIST.json.example`.\n",
"\n",
"Below code use the [`config_list_from_json`](https://microsoft.github.io/autogen/docs/reference/oai/openai_utils#config_list_from_json) function loads a list of configurations from an environment variable or a json file. \n"
"Below code use the [`config_list_from_json`](https://microsoft.github.io/autogen/0.2/docs/reference/oai/openai_utils/#config_list_from_json) function loads a list of configurations from an environment variable or a json file. \n"
]
},
{
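
For reference, a minimal sketch of that loading step, assuming the `OAI_CONFIG_LIST.json` file from the example above sits in the working directory:

```python
# Sketch: load AutoGen model configs. config_list_from_json first checks for an
# environment variable with this name, then falls back to the JSON file.
import autogen

config_list = autogen.config_list_from_json(
    env_or_file="OAI_CONFIG_LIST.json",
    file_location=".",
    filter_dict={"model": ["gpt-4"]},  # optional: keep only matching entries
)
print(f"Loaded {len(config_list)} model configuration(s)")
```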
(changes to an additional file; file path not captured)
@@ -15,7 +15,7 @@
"The tracing capability provided by Prompt flow is built on top of [OpenTelemetry](https://opentelemetry.io/) that gives you complete observability over your LLM applications. \n",
"And there is already a rich set of OpenTelemetry [instrumentation packages](https://opentelemetry.io/ecosystem/registry/?language=python&component=instrumentation) available in OpenTelemetry Eco System. \n",
"\n",
"In this example we will demo how to use [opentelemetry-instrumentation-langchain](https://github.com/traceloop/openllmetry/tree/main/packages/opentelemetry-instrumentation-langchain) package provided by [Traceloop](https://www.traceloop.com/) to instrument [LangChain](https://python.langchain.com/docs/get_started/quickstart) apps.\n",
"In this example we will demo how to use [opentelemetry-instrumentation-langchain](https://github.com/traceloop/openllmetry/tree/main/packages/opentelemetry-instrumentation-langchain) package provided by [Traceloop](https://www.traceloop.com/) to instrument [LangChain](https://python.langchain.com/docs/tutorials/) apps.\n",
"\n",
"\n",
"**Learning Objectives** - Upon completing this tutorial, you should be able to:\n",