This page covers all LangChain integrations with Google Gemini, Google Cloud, and other Google products (such as Google Maps, YouTube, and more).
Unified SDK & Package Consolidation
As of langchain-google-genai 4.0.0, this package uses the consolidated google-genai SDK and now supports both the Gemini Developer API and Vertex AI backends. The langchain-google-vertexai package remains supported for Vertex AI platform-specific features (Model Garden, Vector Search, evaluation services, etc.). Read the full announcement and migration guide.
Not sure which package to use?
Access Google Gemini models via the Gemini Developer API or Vertex AI. The backend is selected automatically based on your configuration.
  • Gemini Developer API: Quick setup with API key, ideal for individual developers and rapid prototyping
  • Vertex AI: Enterprise features with Google Cloud integration (requires GCP project)
Use the langchain-google-genai package for chat models, LLMs, and embeddings. See integrations.
Access Vertex AI platform-specific services beyond Gemini models: Model Garden (Llama, Mistral, Anthropic), evaluation services, and specialized vision models. Use the langchain-google-vertexai package for platform services, and dedicated packages (e.g., langchain-google-community, langchain-google-cloud-sql-pg) for other cloud services such as databases and storage. See integrations.
See Google’s guide on migrating from the Gemini API to Vertex AI for more details on the differences.
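For example, a minimal sketch of backend selection with the unified package; the environment variable names for the Vertex AI backend are assumed from the underlying google-genai SDK conventions, and the model name is a placeholder:
import os
from langchain_google_genai import ChatGoogleGenerativeAI

# Gemini Developer API: an API key is enough (placeholder value)
os.environ["GOOGLE_API_KEY"] = "your-api-key"
# Vertex AI backend instead: point at a GCP project
# (assumed env var names, following the google-genai SDK conventions)
# os.environ["GOOGLE_GENAI_USE_VERTEXAI"] = "true"
# os.environ["GOOGLE_CLOUD_PROJECT"] = "your-project-id"
# os.environ["GOOGLE_CLOUD_LOCATION"] = "us-central1"

llm = ChatGoogleGenerativeAI(model="gemini-2.5-flash")  # model name is a placeholder
print(llm.invoke("Hello!").content)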
Integration packages for Gemini models and the Vertex AI platform are maintained in the langchain-google repository. You can find a host of LangChain integrations with other Google APIs and services in the langchain-google-community package (listed on this page) and the googleapis GitHub organization.

Google Generative AI

Access Google Gemini models via the Gemini Developer API or Vertex AI using the unified langchain-google-genai package.
Package consolidation
Certain langchain-google-vertexai classes for Gemini models are being deprecated in favor of the unified langchain-google-genai package. Please migrate to the new classes. Read the full announcement and migration guide.
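A minimal sketch of the package's main entry points for the sections below, assuming a GOOGLE_API_KEY (or Vertex AI configuration) is already set; the model names are placeholders:
from langchain_google_genai import (
    ChatGoogleGenerativeAI,
    GoogleGenerativeAI,
    GoogleGenerativeAIEmbeddings,
)

chat = ChatGoogleGenerativeAI(model="gemini-2.5-flash")   # chat model
print(chat.invoke("What is LangChain?").content)

llm = GoogleGenerativeAI(model="gemini-2.5-flash")        # string-in, string-out LLM
print(llm.invoke("What is LangChain?"))

embeddings = GoogleGenerativeAIEmbeddings(model="gemini-embedding-001")  # model name is a placeholder
vector = embeddings.embed_query("What is LangChain?")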

Chat models

LLMs

Embedding models


Google Cloud

Access Vertex AI platform-specific services including Model Garden (Llama, Mistral, Anthropic), Vector Search, evaluation services, and specialized vision models.

Chat models

For Gemini models, use ChatGoogleGenerativeAI from langchain-google-genai instead of ChatVertexAI. It supports both Gemini Developer API and Vertex AI backends. The classes below focus on Vertex AI platform services that are not available in the consolidated SDK. Read the full announcement and migration guide.
Llama on Vertex AI Model Garden
from langchain_google_vertexai.model_garden_maas.llama import VertexModelGardenLlama
Mistral on Vertex AI Model Garden
from langchain_google_vertexai.model_garden_maas.mistral import VertexModelGardenMistral
Local Gemma model loaded from HuggingFace.
from langchain_google_vertexai.gemma import GemmaChatLocalHF
Local Gemma model loaded from Kaggle.
from langchain_google_vertexai.gemma import GemmaChatLocalKaggle
Gemma on Vertex AI Model Garden
from langchain_google_vertexai.gemma import GemmaChatVertexAIModelGarden
Implementation of the Image Captioning model as a chat.
from langchain_google_vertexai.vision_models import VertexAIImageCaptioningChat
Given an image and a prompt, edit the image. Currently only supports mask-free editing.
from langchain_google_vertexai.vision_models import VertexAIImageEditorChat
Generates an image from a prompt.
from langchain_google_vertexai.vision_models import VertexAIImageGeneratorChat
Chat implementation of a visual QnA model.
from langchain_google_vertexai.vision_models import VertexAIVisualQnAChat
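These vision classes are driven like any other LangChain chat model. A minimal sketch for image generation, assuming application default credentials and a GCP project with Vertex AI enabled; the exact shape of the returned content may vary by model version:
from langchain_core.messages import HumanMessage
from langchain_google_vertexai.vision_models import VertexAIImageGeneratorChat

# Requires a GCP project with Vertex AI enabled and application default credentials
generator = VertexAIImageGeneratorChat()
response = generator.invoke([HumanMessage(content=["a cat reading a book"])])
# response is an AIMessage whose content carries the generated image data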

LLMs

These classes implement the (legacy) string-in, string-out LLM interface.
Gemma
Local Gemma model loaded from HuggingFace.
from langchain_google_vertexai.gemma import GemmaLocalHF
Local Gemma model loaded from Kaggle.
from langchain_google_vertexai.gemma import GemmaLocalKaggle
Gemma on Vertex AI Model Garden
from langchain_google_vertexai.gemma import GemmaVertexAIModelGarden
Implementation of the Image Captioning model as an LLM.
from langchain_google_vertexai.vision_models import VertexAIImageCaptioning
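A minimal sketch of the string-in, string-out interface against a Gemma endpoint deployed through Model Garden; the endpoint_id, project, and location values are placeholders for your own deployment:
from langchain_google_vertexai.gemma import GemmaVertexAIModelGarden

# Placeholder values for your own Model Garden deployment
llm = GemmaVertexAIModelGarden(
    endpoint_id="your-endpoint-id",
    project="your-project-id",
    location="us-central1",
)
print(llm.invoke("What is the meaning of life?"))  # plain string in, plain string out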

Embedding models

Document loaders

Load documents from various Google Cloud sources.

Cloud Vision loader

Load data using Google Cloud Vision API.
from langchain_google_community.vision import CloudVisionLoader
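A minimal sketch of the Cloud Vision loader; the parameter names (a Cloud Storage URI via file_path plus a project id) are assumptions, so check the API reference for the exact signature:
from langchain_google_community.vision import CloudVisionLoader

# Assumed parameters: a Cloud Storage URI and a GCP project id
loader = CloudVisionLoader(
    file_path="gs://your-bucket/scanned-document.pdf",
    project="your-project-id",
)
docs = loader.load()  # list of Document objects with the extracted text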

Document transformers

Transform documents using Google Cloud services.

Vector stores

Store and search vectors using Google Cloud databases and Vertex AI Vector Search.

Retrievers

Retrieve information using Google Cloud services.
Other retrievers
from langchain_google_community import VertexAIMultiTurnSearchRetriever
from langchain_google_community import VertexAISearchRetriever
from langchain_google_community import VertexAISearchSummaryTool
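A minimal sketch of the Vertex AI Search retriever, assuming an existing search data store; the project, location, and data store ids are placeholders, and the parameter names are assumptions based on the retriever's documented usage:
from langchain_google_community import VertexAISearchRetriever

retriever = VertexAISearchRetriever(
    project_id="your-project-id",
    location_id="global",
    data_store_id="your-data-store-id",
    max_documents=3,
)
docs = retriever.invoke("What are the company holidays?")  # list of Documents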

Tools

Integrate agents with various Google Cloud services.

Callbacks

Track LLM/Chat model usage.
Callback handler that tracks Vertex AI usage info.
from langchain_google_vertexai.callbacks import VertexAICallbackHandler
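A minimal sketch of attaching the handler to a langchain-google-vertexai model call; the model name is a placeholder and the usage attribute names are assumptions:
from langchain_google_vertexai import ChatVertexAI
from langchain_google_vertexai.callbacks import VertexAICallbackHandler

handler = VertexAICallbackHandler()
llm = ChatVertexAI(model_name="gemini-2.0-flash")  # model name is a placeholder
llm.invoke("Tell me a joke", config={"callbacks": [handler]})

# The handler accumulates usage across calls (attribute names assumed)
print(handler.prompt_tokens, handler.completion_tokens)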

Evaluators

Evaluate model outputs using Vertex AI.
Pair-wise evaluation using Vertex AI models.
from langchain_google_vertexai.evaluators.evaluation import VertexPairWiseStringEvaluator
Evaluate a single prediction string using Vertex AI models.
from langchain_google_vertexai.evaluators.evaluation import VertexStringEvaluator
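A minimal sketch of single-prediction evaluation; evaluate_strings is the standard LangChain string-evaluator method, while the metric name and constructor arguments are assumptions, so check the API reference:
from langchain_google_vertexai.evaluators.evaluation import VertexStringEvaluator

# Metric name and constructor arguments are assumptions
evaluator = VertexStringEvaluator(metric="bleu", project_id="your-project-id")
result = evaluator.evaluate_strings(
    prediction="London is the capital of Great Britain.",
    reference="The capital of Great Britain is London.",
)
print(result)  # a dict containing the metric score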

Other Google products

Integrations with various Google services beyond the core Cloud Platform.

Document loaders

Vector stores

Retrievers

Tools

MCP

Toolkits

Collections of tools for specific Google services.

Chat loaders


3rd party integrations

Access Google services via unofficial third-party APIs.

YouTube

