Mirror of https://github.com/OpenHands/OpenHands.git
Synced 2026-03-22 13:47:19 +08:00
Add OpenRouter provider docs (#3986)
@@ -1,6 +1,6 @@
 # Azure
 
-OpenHands uses LiteLLM for completion calls. You can find their documentation on Azure [here](https://docs.litellm.ai/docs/providers/azure).
+OpenHands uses LiteLLM to make calls to Azure's chat models. You can find their documentation on using Azure as a provider [here](https://docs.litellm.ai/docs/providers/azure).
 
 ## Azure OpenAI Configuration
 
@@ -27,7 +27,7 @@ You will need your ChatGPT deployment name which can be found on the deployments
 
 * Enable `Advanced Options`
 * `Custom Model` to azure/<deployment-name>
-* `Base URL` to your Azure API Base URL (Example: `https://example-endpoint.openai.azure.com`)
+* `Base URL` to your Azure API Base URL (e.g. `https://example-endpoint.openai.azure.com`)
 * `API Key` to your Azure API key
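The three Advanced Options above map directly onto the arguments a LiteLLM-style Azure call takes. As a rough sketch only, with a hypothetical helper name and placeholder deployment, endpoint, and key:

```python
# Hypothetical helper showing how the Advanced Options map onto
# LiteLLM-style arguments for Azure. All values are placeholders.
def azure_llm_settings(deployment_name: str, base_url: str, api_key: str) -> dict:
    return {
        "model": f"azure/{deployment_name}",  # the `Custom Model` field
        "api_base": base_url,                 # the `Base URL` field
        "api_key": api_key,                   # the `API Key` field
    }

settings = azure_llm_settings(
    "my-gpt-deployment",
    "https://example-endpoint.openai.azure.com",
    "<your-azure-api-key>",
)
print(settings["model"])  # azure/my-gpt-deployment
```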
 
 ## Embeddings
 
@@ -1,6 +1,6 @@
 # Google Gemini/Vertex
 
-OpenHands uses LiteLLM for completion calls. The following resources are relevant for using OpenHands with Google's LLMs:
+OpenHands uses LiteLLM to make calls to Google's chat models. You can find their documentation on using Google as a provider:
 
 - [Gemini - Google AI Studio](https://docs.litellm.ai/docs/providers/gemini)
 - [VertexAI - Google Cloud Platform](https://docs.litellm.ai/docs/providers/vertex)
@@ -10,7 +10,7 @@ OpenHands uses LiteLLM for completion calls. The following resources are relevan
 When running OpenHands, you'll need to set the following in the OpenHands UI through the Settings:
 * `LLM Provider` to `Gemini`
 * `LLM Model` to the model you will be using.
-If the model is not in the list, toggle `Advanced Options`, and enter it in `Custom Model` (i.e. gemini/<model-name>).
+If the model is not in the list, toggle `Advanced Options`, and enter it in `Custom Model` (e.g. gemini/<model-name> like `gemini/gemini-1.5-pro`).
 * `API Key` to your Gemini API key
 
 ## VertexAI - Google Cloud Platform Configs
@@ -27,4 +27,4 @@ VERTEXAI_LOCATION="<your-gcp-location>"
 Then set the following in the OpenHands UI through the Settings:
 * `LLM Provider` to `VertexAI`
 * `LLM Model` to the model you will be using.
-If the model is not in the list, toggle `Advanced Options`, and enter it in `Custom Model` (i.e. vertex_ai/<model-name>).
+If the model is not in the list, toggle `Advanced Options`, and enter it in `Custom Model` (e.g. vertex_ai/<model-name>).
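Both the Gemini and VertexAI entries use the same `Custom Model` convention: a LiteLLM provider prefix, a slash, then the model name. A minimal illustration (the helper name and sample model are assumptions for the example):

```python
# Illustrative only: the `Custom Model` value is <provider-prefix>/<model-name>.
def custom_model(provider_prefix: str, model_name: str) -> str:
    return f"{provider_prefix}/{model_name}"

print(custom_model("gemini", "gemini-1.5-pro"))     # gemini/gemini-1.5-pro
print(custom_model("vertex_ai", "gemini-1.5-pro"))  # vertex_ai/gemini-1.5-pro
```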
@@ -1,6 +1,6 @@
 # Groq
 
-OpenHands uses LiteLLM to make calls to chat models on Groq. You can find their full documentation on using Groq as provider [here](https://docs.litellm.ai/docs/providers/groq).
+OpenHands uses LiteLLM to make calls to chat models on Groq. You can find their documentation on using Groq as a provider [here](https://docs.litellm.ai/docs/providers/groq).
 
 ## Configuration
 
@@ -8,7 +8,7 @@ When running OpenHands, you'll need to set the following in the OpenHands UI thr
 * `LLM Provider` to `Groq`
 * `LLM Model` to the model you will be using. [Visit here to see the list of
 models that Groq hosts](https://console.groq.com/docs/models). If the model is not in the list, toggle
-`Advanced Options`, and enter it in `Custom Model` (i.e. groq/<model-name>)
+`Advanced Options`, and enter it in `Custom Model` (e.g. groq/<model-name> like `groq/llama3-70b-8192`)
 * `API key` to your Groq API key. To find or create your Groq API Key, [see here](https://console.groq.com/keys)
 
 
@@ -18,6 +18,6 @@ models that Groq hosts](https://console.groq.com/docs/models). If the model is n
 The Groq endpoint for chat completion is [mostly OpenAI-compatible](https://console.groq.com/docs/openai). Therefore, you can access Groq models as you
 would access any OpenAI-compatible endpoint. You can set the following in the OpenHands UI through the Settings:
 * Enable `Advanced Options`
-* `Custom Model` to the prefix `openai/` + the model you will be using (Example: `openai/llama3-8b-8192`)
+* `Custom Model` to the prefix `openai/` + the model you will be using (e.g. `openai/llama3-70b-8192`)
 * `Base URL` to `https://api.groq.com/openai/v1`
 * `API Key` to your Groq API key
 
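Because the endpoint is mostly OpenAI-compatible, the request that ultimately reaches Groq has the shape of a standard OpenAI chat completion aimed at Groq's base URL. A sketch of that request shape, payload only, with no network call (the model name is one example from Groq's catalog, and the `openai/` prefix is, to our understanding, LiteLLM routing rather than part of the wire format):

```python
import json

# Standard OpenAI-style chat completion payload, aimed at Groq's
# OpenAI-compatible base URL. Nothing is actually sent here.
BASE_URL = "https://api.groq.com/openai/v1"
endpoint = f"{BASE_URL}/chat/completions"

payload = {
    "model": "llama3-70b-8192",  # no `openai/` prefix on the wire
    "messages": [{"role": "user", "content": "Hello"}],
}

print(endpoint)
print(json.dumps(payload, indent=2))
```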
@@ -55,6 +55,7 @@ We have a few guides for running OpenHands with specific model providers:
 * [Google](llms/google-llms)
 * [Groq](llms/groq)
 * [OpenAI](llms/openai-llms)
+* [OpenRouter](llms/openrouter)
 
 ### API retries and rate limits
 
@@ -1,6 +1,6 @@
 # OpenAI
 
-OpenHands uses LiteLLM to make calls to OpenAI's chat models. You can find their full documentation on OpenAI chat calls [here](https://docs.litellm.ai/docs/providers/openai).
+OpenHands uses LiteLLM to make calls to OpenAI's chat models. You can find their documentation on using OpenAI as a provider [here](https://docs.litellm.ai/docs/providers/openai).
 
 ## Configuration
 
@@ -8,7 +8,7 @@ When running OpenHands, you'll need to set the following in the OpenHands UI thr
 * `LLM Provider` to `OpenAI`
 * `LLM Model` to the model you will be using.
 [Visit here to see a full list of OpenAI models that LiteLLM supports.](https://docs.litellm.ai/docs/providers/openai#openai-chat-completion-models)
-If the model is not in the list, toggle `Advanced Options`, and enter it in `Custom Model` (i.e. openai/<model-name>).
+If the model is not in the list, toggle `Advanced Options`, and enter it in `Custom Model` (e.g. openai/<model-name> like `openai/gpt-4o`).
 * `API Key` to your OpenAI API key. To find or create your OpenAI Project API Key, [see here](https://platform.openai.com/api-keys).
 
 ## Using OpenAI-Compatible Endpoints
@@ -19,6 +19,6 @@ Just as for OpenAI Chat completions, we use LiteLLM for OpenAI-compatible endpoi
 
 If you're using an OpenAI proxy, you'll need to set the following in the OpenHands UI through the Settings:
 * Enable `Advanced Options`
-* `Custom Model` to openai/<model-name> (i.e.: `openai/gpt-4o` or openai/<proxy-prefix>/<model-name>)
+* `Custom Model` to openai/<model-name> (e.g. `openai/gpt-4o` or openai/<proxy-prefix>/<model-name>)
 * `Base URL` to the URL of your OpenAI proxy
 * `API Key` to your OpenAI API key
 
12
docs/modules/usage/llms/openrouter.md
Normal file
@@ -0,0 +1,12 @@
+# OpenRouter
+
+OpenHands uses LiteLLM to make calls to chat models on OpenRouter. You can find their documentation on using OpenRouter as a provider [here](https://docs.litellm.ai/docs/providers/openrouter).
+
+## Configuration
+
+When running OpenHands, you'll need to set the following in the OpenHands UI through the Settings:
+* `LLM Provider` to `OpenRouter`
+* `LLM Model` to the model you will be using.
+[Visit here to see a full list of OpenRouter models](https://openrouter.ai/models).
+If the model is not in the list, toggle `Advanced Options`, and enter it in `Custom Model` (e.g. openrouter/<model-name> like `openrouter/anthropic/claude-3.5-sonnet`).
+* `API Key` to your OpenRouter API key.
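Note that OpenRouter model IDs already include the upstream provider, so the `Custom Model` value ends up with two slashes. A one-line illustration (the model ID is an example from OpenRouter's catalog):

```python
# OpenRouter model IDs already include the upstream provider
# (e.g. anthropic/claude-3.5-sonnet), so the Custom Model value
# is "openrouter/" plus that full ID.
model_id = "anthropic/claude-3.5-sonnet"  # example catalog entry
custom_model = "openrouter/" + model_id
print(custom_model)  # openrouter/anthropic/claude-3.5-sonnet
```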
@@ -21,11 +21,6 @@ const sidebars: SidebarsConfig = {
         type: 'category',
         label: 'Providers',
         items: [
-          {
-            type: 'doc',
-            label: 'OpenAI',
-            id: 'usage/llms/openai-llms',
-          },
           {
             type: 'doc',
             label: 'Azure',
@@ -40,7 +35,17 @@ const sidebars: SidebarsConfig = {
             type: 'doc',
             label: 'Groq',
             id: 'usage/llms/groq',
-          }
+          },
+          {
+            type: 'doc',
+            label: 'OpenAI',
+            id: 'usage/llms/openai-llms',
+          },
+          {
+            type: 'doc',
+            label: 'OpenRouter',
+            id: 'usage/llms/openrouter',
+          },
         ],
       },
     ],