Google Gemini/Vertex
OpenHands uses LiteLLM to make calls to Google's chat models. See LiteLLM's documentation on using Google as a provider for the full list of supported models and options.
Gemini - Google AI Studio Configs
When running OpenHands, you'll need to set the following in the OpenHands UI through the Settings:
- LLM Provider to Gemini
- LLM Model to the model you will be using. If the model is not in the list, toggle Advanced options and enter it in Custom Model (e.g. gemini/<model-name>, like gemini/gemini-2.0-flash).
- API Key to your Gemini API key
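To verify the key works before pasting it into the UI, you can call Google AI Studio's REST API directly. This sketch is independent of OpenHands; the model id gemini-2.0-flash and the GEMINI_API_KEY shell variable are illustrative placeholders:

# Sanity check: call the Gemini API directly (not part of OpenHands).
# Assumes GEMINI_API_KEY holds your Google AI Studio key.
curl -s "https://generativelanguage.googleapis.com/v1beta/models/gemini-2.0-flash:generateContent?key=${GEMINI_API_KEY}" \
  -H "Content-Type: application/json" \
  -d '{"contents": [{"parts": [{"text": "Say hello"}]}]}'

If the key is valid, the response is a JSON object containing the model's reply; an invalid key returns an error object instead.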
VertexAI - Google Cloud Platform Configs
To use Vertex AI through Google Cloud Platform when running OpenHands, you'll need to set the following environment variables using -e in the docker run command:
GOOGLE_APPLICATION_CREDENTIALS="<json-dump-of-gcp-service-account-json>"
VERTEXAI_PROJECT="<your-gcp-project-id>"
VERTEXAI_LOCATION="<your-gcp-location>"
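As an illustration, a full docker run command might look like the sketch below. The image names, the <version> tags, and the surrounding flags mirror the standard OpenHands installation instructions and may need adjusting for your release; the service-account path is a placeholder:

# Sketch of a docker run invocation passing the Vertex AI variables via -e.
# GOOGLE_APPLICATION_CREDENTIALS takes the JSON contents of the service account key,
# so the file is read inline with $(cat ...).
docker run -it --rm --pull=always \
  -e GOOGLE_APPLICATION_CREDENTIALS="$(cat /path/to/service-account.json)" \
  -e VERTEXAI_PROJECT="<your-gcp-project-id>" \
  -e VERTEXAI_LOCATION="<your-gcp-location>" \
  -e SANDBOX_RUNTIME_CONTAINER_IMAGE=docker.all-hands.dev/all-hands-ai/runtime:<version>-nikolaik \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -p 3000:3000 \
  --add-host host.docker.internal:host-gateway \
  --name openhands-app \
  docker.all-hands.dev/all-hands-ai/openhands:<version>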
Then set the following in the OpenHands UI through the Settings:
- LLM Provider to VertexAI
- LLM Model to the model you will be using. If the model is not in the list, toggle Advanced options and enter it in Custom Model (e.g. vertex_ai/<model-name>).
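To confirm the project, location, and service account work outside OpenHands, you can call the Vertex AI REST endpoint directly. This is a minimal sketch, assuming gcloud is installed and authenticated for the same project; gemini-2.0-flash is a placeholder model id, and the shell variables match the environment variables above:

# Sanity check against the Vertex AI generateContent endpoint (independent of OpenHands).
# Assumes gcloud is authenticated and the model is available in the chosen location.
curl -s -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  "https://${VERTEXAI_LOCATION}-aiplatform.googleapis.com/v1/projects/${VERTEXAI_PROJECT}/locations/${VERTEXAI_LOCATION}/publishers/google/models/gemini-2.0-flash:generateContent" \
  -d '{"contents": [{"role": "user", "parts": [{"text": "Say hello"}]}]}'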