Vertex AI supported models
Quick answer
Google Vertex AI supports Gemini models such as gemini-2.5-pro, gemini-2.0-flash, and gemini-1.5-pro for text generation and chat. Use the vertexai Python SDK to generate content with these models; the current list of supported foundation models is published in the Vertex AI Model Garden documentation.
Setup
Install the google-cloud-aiplatform package (which provides the `vertexai` module) and authenticate with Google Cloud using Application Default Credentials or a service account key.
- Run `pip install google-cloud-aiplatform`
- Authenticate with `gcloud auth application-default login`, or set the `GOOGLE_APPLICATION_CREDENTIALS` environment variable to the path of a service account key file
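Before calling the SDK, a quick environment check can catch the most common setup mistakes. A minimal sketch (`preflight` is a hypothetical helper; the variable names follow Google's Application Default Credentials conventions):

```python
import os


def preflight():
    """Return a list of setup problems to fix before calling vertexai.init().

    An unset GOOGLE_APPLICATION_CREDENTIALS is not itself an error: credentials
    may also come from `gcloud auth application-default login`.
    """
    problems = []
    if not os.environ.get('GOOGLE_CLOUD_PROJECT'):
        problems.append('GOOGLE_CLOUD_PROJECT is not set')
    creds = os.environ.get('GOOGLE_APPLICATION_CREDENTIALS')
    if creds and not os.path.isfile(creds):
        problems.append(f'GOOGLE_APPLICATION_CREDENTIALS points to a missing file: {creds}')
    return problems
```

Run it and fix anything it reports before moving on.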
Step by step
Use the vertexai SDK to initialize the client and generate text with a Gemini model. (The SDK does not expose a call that enumerates foundation models; consult the Vertex AI Model Garden for the current list.)
```python
import os

import vertexai
from vertexai.generative_models import GenerativeModel

# Initialize the Vertex AI SDK with your project and region
vertexai.init(project=os.environ['GOOGLE_CLOUD_PROJECT'], location='us-central1')

# Select a Gemini model
model = GenerativeModel('gemini-2.5-pro')

# Generate text
response = model.generate_content("Explain quantum computing in simple terms.")
print('Generated text:')
print(response.text)
```
Output
Generated text:
Quantum computing uses quantum bits, or qubits, to perform calculations much faster than classical computers by leveraging superposition and entanglement.
(Model output will vary between runs.)
Common variations
You can use other Gemini models: gemini-2.0-flash trades some quality for faster, cheaper responses, while gemini-1.5-pro is the previous generation's flagship. The SDK also supports streaming responses and chat-based interactions.
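For multi-turn chat, the SDK's `GenerativeModel.start_chat()` returns a session that accumulates history across turns. A minimal sketch; since a live call needs credentials, the function takes any model-like object (with the real SDK you would pass `GenerativeModel('gemini-2.5-pro')` after `vertexai.init()`):

```python
def multi_turn(model):
    """Run a two-turn chat; `model` is any object exposing start_chat().

    The chat session object carries earlier turns automatically, so the
    second question can refer back to the first message.
    """
    chat = model.start_chat()
    chat.send_message("My name is Ada.")
    reply = chat.send_message("What is my name?")  # history includes the first turn
    return reply.text
```

With live credentials: `print(multi_turn(GenerativeModel('gemini-2.5-pro')))`.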
```python
import os

import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project=os.environ['GOOGLE_CLOUD_PROJECT'], location='us-central1')

# Streaming example: chunks arrive as they are generated
model = GenerativeModel('gemini-2.5-pro')
response = model.generate_content("What's the weather like today?", stream=True)
for chunk in response:
    print(chunk.text, end='')
```
Output
The weather today is sunny with a high of 75 degrees Fahrenheit.
(Illustrative only: the model has no live weather data, so real responses will differ.)
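If you also need the full text after streaming (for logging, say), the chunks can simply be concatenated. A small helper sketch (`collect_stream` is a hypothetical name; chunk objects only need a `.text` attribute, as in the SDK's streamed responses):

```python
def collect_stream(chunks):
    """Print streamed chunks as they arrive and return the concatenated text."""
    parts = []
    for chunk in chunks:
        print(chunk.text, end='')  # show progress immediately
        parts.append(chunk.text)
    return ''.join(parts)
```

Usage with the SDK: `full_text = collect_stream(model.generate_content(prompt, stream=True))`.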
Troubleshooting
- If you get authentication errors, ensure your service account has the
Vertex AI Userrole and your credentials are correctly set. - If models do not list, verify your project and location are correct in
vertexai.init(). - For quota errors, check your Google Cloud quota limits for Vertex AI.
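For transient quota errors, a simple exponential-backoff wrapper often suffices. A generic sketch (`with_retries` is a hypothetical helper; with the real SDK you would pass `google.api_core.exceptions.ResourceExhausted` as the retryable type):

```python
import time


def with_retries(call, attempts=3, base_delay=1.0, retryable=(Exception,)):
    """Call `call()` and retry with exponential backoff on retryable errors."""
    for attempt in range(attempts):
        try:
            return call()
        except retryable:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the error to the caller
            time.sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...
```

Usage: `with_retries(lambda: model.generate_content(prompt), retryable=(ResourceExhausted,))`.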
Key Takeaways
- Use the vertexai Python SDK to access supported Gemini models.
- Models like gemini-2.5-pro offer state-of-the-art text generation on Vertex AI.
- Authenticate properly with Google Cloud and set the correct project and location.
- Streaming and chat interactions are supported for flexible use cases.