Fix issue #4912: [Bug]: BedrockException: "The number of toolResult blocks at messages.2.content exceeds the number of toolUse blocks of previous turn." (#4937)

Co-authored-by: Xingyao Wang <xingyao@all-hands.dev>
Co-authored-by: Graham Neubig <neubig@gmail.com>
Co-authored-by: mamoodi <mamoodiha@gmail.com>
Authored by OpenHands on 2024-11-12 17:23:11 -05:00; committed by GitHub
parent 59f7093428
commit 207df9dd30
3 changed files with 26 additions and 0 deletions

@@ -0,0 +1,20 @@
# LiteLLM Proxy

OpenHands supports using the [LiteLLM proxy](https://docs.litellm.ai/docs/proxy/quick_start) to access various LLM providers.

## Configuration

To use LiteLLM proxy with OpenHands, you need to:

1. Set up a LiteLLM proxy server (see the [LiteLLM documentation](https://docs.litellm.ai/docs/proxy/quick_start))
2. When running OpenHands, set the following in the OpenHands UI through the Settings (a quick way to verify these values is shown after this list):
   * Enable `Advanced Options`
   * Set `Custom Model` to the prefix `litellm_proxy/` + the model you will be using (e.g. `litellm_proxy/anthropic.claude-3-5-sonnet-20241022-v2:0`)
   * Set `Base URL` to your LiteLLM proxy URL (e.g. `https://your-litellm-proxy.com`)
   * Set `API Key` to your LiteLLM proxy API key
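Since the LiteLLM proxy exposes an OpenAI-compatible API, one way to verify these values before entering them in the UI is a quick request with the `openai` Python client. This is an illustrative sketch only; the URL, API key, and model name below are placeholders for your own values, and note that the model name is used *without* the `litellm_proxy/` prefix when talking to the proxy directly:

```python
# Sanity-check a LiteLLM proxy before configuring OpenHands.
# All values below are placeholders; substitute your own.
from openai import OpenAI

client = OpenAI(
    base_url="https://your-litellm-proxy.com",  # the Base URL you will enter in the UI
    api_key="your-litellm-proxy-api-key",       # the API Key you will enter in the UI
)

response = client.chat.completions.create(
    # Model name as configured in your proxy; no litellm_proxy/ prefix here.
    model="anthropic.claude-3-5-sonnet-20241022-v2:0",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```

If this request succeeds, the same base URL, API key, and `litellm_proxy/`-prefixed model name should work in the OpenHands settings.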

## Supported Models

The supported models depend on your LiteLLM proxy configuration; OpenHands can work with any model your proxy is configured to handle.

Refer to your LiteLLM proxy configuration for the list of available models and their names.
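For example, because the proxy speaks the OpenAI API, you can ask it directly which models it is configured to serve (again, the URL and key below are placeholders):

```python
# List the models a LiteLLM proxy is configured to serve.
# URL and API key are placeholders; substitute your own.
from openai import OpenAI

client = OpenAI(
    base_url="https://your-litellm-proxy.com",
    api_key="your-litellm-proxy-api-key",
)

for model in client.models.list():
    print(model.id)  # e.g. anthropic.claude-3-5-sonnet-20241022-v2:0
```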

@@ -63,6 +63,7 @@ We have a few guides for running OpenHands with specific model providers:
- [Azure](llms/azure-llms)
- [Google](llms/google-llms)
- [Groq](llms/groq)
- [LiteLLM Proxy](llms/litellm-proxy)
- [OpenAI](llms/openai-llms)
- [OpenRouter](llms/openrouter)

@@ -76,6 +76,11 @@ const sidebars: SidebarsConfig = {
        label: 'Groq',
        id: 'usage/llms/groq',
      },
      {
        type: 'doc',
        label: 'LiteLLM Proxy',
        id: 'usage/llms/litellm-proxy',
      },
      {
        type: 'doc',
        label: 'OpenAI',