added google_search_tool for gemini-2 #677

Merged · 1 commit · Jan 7, 2025
17 changes: 11 additions & 6 deletions libs/vertexai/langchain_google_vertexai/chat_models.py
@@ -652,12 +652,7 @@ class ChatVertexAI(_VertexAICommon, BaseChatModel):
"""Google Cloud Vertex AI chat model integration.

Setup:
You must have the langchain-google-vertexai Python package installed
.. code-block:: bash

pip install -U langchain-google-vertexai

And either:
You must either:
- Have credentials configured for your environment (gcloud, workload identity, etc...)
- Store the path to a service account JSON file as the GOOGLE_APPLICATION_CREDENTIALS environment variable

@@ -803,6 +798,16 @@ class GetPopulation(BaseModel):

     See ``ChatVertexAI.bind_tools()`` method for more.
 
+    Use Search with Gemini 2:
+        .. code-block:: python
+
+            from google.cloud.aiplatform_v1beta1.types import Tool as VertexTool
+            llm = ChatVertexAI(model="gemini-2.0-flash-exp")
+            resp = llm.invoke(
+                "When is the next total solar eclipse in US?",
+                tools=[VertexTool(google_search={})],
+            )
+
     Structured output:
         .. code-block:: python
 
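The docstring example above passes the search tool per call. Below is a minimal sketch of the equivalent one-time binding, assuming ``ChatVertexAI.bind_tools`` accepts the raw gapic ``Tool`` the same way ``invoke(..., tools=...)`` does; this is illustration only and not part of the PR.

.. code-block:: python

    # Illustrative sketch: bind the Google Search tool once, assuming
    # bind_tools routes the gapic Tool through the same formatting path.
    from google.cloud.aiplatform_v1beta1.types import Tool as VertexTool

    from langchain_google_vertexai import ChatVertexAI

    llm = ChatVertexAI(model="gemini-2.0-flash-exp")
    llm_with_search = llm.bind_tools([VertexTool(google_search={})])

    resp = llm_with_search.invoke("When is the next total solar eclipse in US?")
    print(resp.content)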
2 changes: 2 additions & 0 deletions libs/vertexai/langchain_google_vertexai/functions_utils.py
@@ -283,6 +283,8 @@ def _format_to_gapic_tool(tools: _ToolsType) -> gapic.Tool:
                 gapic_tool.google_search_retrieval = rt.google_search_retrieval
             if "function_declarations" in rt:
                 gapic_tool.function_declarations.extend(rt.function_declarations)
+            if "google_search" in rt:
+                gapic_tool.google_search = rt.google_search
         elif isinstance(tool, dict):
             # not _ToolDictLike
             if not any(
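The new branch above copies a ``google_search`` field from an incoming gapic ``Tool`` onto the combined tool sent to Vertex AI. A rough, hypothetical sketch of exercising it directly follows; ``_format_to_gapic_tool`` is an internal helper, so treat this purely as illustration, assuming it accepts a list containing the gapic ``Tool`` as its signature suggests.

.. code-block:: python

    # Hypothetical illustration of the new branch: a Tool that sets
    # google_search ends up with the field set on the merged gapic Tool.
    from google.cloud.aiplatform_v1beta1.types import Tool as VertexTool

    from langchain_google_vertexai.functions_utils import _format_to_gapic_tool

    search_tool = VertexTool(google_search={})       # empty GoogleSearch config
    combined = _format_to_gapic_tool([search_tool])  # internal helper from the diff
    assert "google_search" in combined               # same field-presence check as the diff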
2 changes: 1 addition & 1 deletion libs/vertexai/poetry.lock

Some generated files are not rendered by default.

2 changes: 1 addition & 1 deletion libs/vertexai/pyproject.toml
@@ -13,7 +13,7 @@ license = "MIT"
 [tool.poetry.dependencies]
 python = ">=3.9,<4.0"
 langchain-core = ">=0.3.27,<0.4"
-google-cloud-aiplatform = "^1.73.0"
+google-cloud-aiplatform = "^1.75.0"
 google-cloud-storage = "^2.18.0"
 # optional dependencies
 anthropic = { extras = ["vertexai"], version = ">=0.35.0,<1", optional = true }
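The ``google-cloud-aiplatform`` floor moves from ``^1.73.0`` to ``^1.75.0``, presumably because the ``google_search`` field on the ``Tool`` proto only exists in newer SDK releases. Below is a rough sketch of verifying the installed SDK before relying on the field; the ``hasattr`` probe is an assumption about how the proto-plus message exposes it.

.. code-block:: python

    # Illustrative check: confirm the installed SDK is new enough to expose
    # Tool.google_search before using the Gemini 2 search tool.
    from google.cloud import aiplatform
    from google.cloud.aiplatform_v1beta1.types import Tool as VertexTool

    print(aiplatform.__version__)                  # expect 1.75.0 or newer
    print(hasattr(VertexTool(), "google_search"))  # True once the field exists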
4 changes: 4 additions & 0 deletions libs/vertexai/tests/integration_tests/test_anthropic_cache.py
@@ -10,6 +10,7 @@


 @pytest.mark.extended
+@pytest.mark.skip(reason="claude-3-5-v2 not enabled")
 def test_anthropic_system_cache() -> None:
     """Test chat with system message having cache control."""
     project = os.environ["PROJECT_ID"]
@@ -35,6 +36,7 @@ def test_anthropic_system_cache() -> None:


 @pytest.mark.extended
+@pytest.mark.skip(reason="claude-3-5-v2 not enabled")
 def test_anthropic_mixed_cache() -> None:
     """Test chat with different cache control types."""
     project = os.environ["PROJECT_ID"]
@@ -72,6 +74,7 @@ def test_anthropic_mixed_cache() -> None:


 @pytest.mark.extended
+@pytest.mark.skip(reason="claude-3-5-v2 not enabled")
 def test_anthropic_conversation_cache() -> None:
     """Test chat conversation with cache control."""
     project = os.environ["PROJECT_ID"]
@@ -115,6 +118,7 @@ def test_anthropic_conversation_cache() -> None:


 @pytest.mark.extended
+@pytest.mark.skip(reason="claude-3-5-v2 not enabled")
 def test_anthropic_chat_template_cache() -> None:
     """Test chat template with structured content and cache control."""
     project = os.environ["PROJECT_ID"]
1 change: 1 addition & 0 deletions libs/vertexai/tests/integration_tests/test_chat_models.py
@@ -1175,6 +1175,7 @@ def test_multimodal_pdf_input_b64(multimodal_pdf_chain: RunnableSerializable) ->
     assert isinstance(response, AIMessage)
 
 
+@pytest.mark.xfail(reason="logprobs are subject to daily quotas")
 @pytest.mark.release
 def test_logprobs() -> None:
     llm = ChatVertexAI(model="gemini-1.5-flash", logprobs=2)