Make Claude Sonnet 3.5 the recommended model and update docs accordingly (#4151)

This commit is contained in:
mamoodi
2024-10-01 16:32:39 -04:00
committed by GitHub
parent 53a015f718
commit 04643d6f3c
7 changed files with 10 additions and 11 deletions


@@ -6,9 +6,9 @@ sidebar_position: 2
## System Requirements
-* Docker version 26.0.0+ or Docker Desktop 4.31.0+
-* You must be using Linux or Mac OS
-* If you are on Windows, you must use [WSL](https://learn.microsoft.com/en-us/windows/wsl/install)
+* Docker version 26.0.0+ or Docker Desktop 4.31.0+.
+* You must be using Linux or Mac OS.
+* If you are on Windows, you must use [WSL](https://learn.microsoft.com/en-us/windows/wsl/install).
## Installation
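The Docker version requirement above can be checked from the command line. The script below is a hypothetical helper, not part of the OpenHands repository; it uses `docker version --format` and `sort -V` to compare the installed server version against the documented minimum:

```shell
#!/bin/sh
# Check that the installed Docker meets the documented minimum (26.0.0).
REQUIRED="26.0.0"

version_ge() {
  # True if $1 >= $2 when compared as version numbers.
  [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

INSTALLED="$(docker version --format '{{.Server.Version}}' 2>/dev/null || echo 0)"
if version_ge "$INSTALLED" "$REQUIRED"; then
  echo "Docker $INSTALLED satisfies the $REQUIRED+ requirement"
else
  echo "Docker $INSTALLED is too old (need $REQUIRED+)" >&2
fi
```

Docker Desktop users can check the desktop version (4.31.0+) from the application's About dialog instead.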


@@ -8,7 +8,7 @@ This mode is different from the [headless mode](headless-mode), which is non-int
To start an interactive OpenHands session via the command line, follow these steps:
-1. Ensure you have followed the [Development setup instructions](https://github.com/All-Hands-AI/OpenHands/blob/main/Development.md)
+1. Ensure you have followed the [Development setup instructions](https://github.com/All-Hands-AI/OpenHands/blob/main/Development.md).
2. Run the following command:


@@ -9,8 +9,8 @@ as python and Node.js but your use case may need additional software installed b
There are two ways you can do so:
-1. Use an existing image from docker hub
-2. Creating your own custom docker image and using it
+1. Use an existing image from docker hub.
+2. Creating your own custom docker image and using it.
If you want to take the first approach, you can skip the `Create Your Docker Image` section.
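The second option in the list above can be sketched as a minimal Dockerfile. The base image and the extra package chosen here are illustrative assumptions, not taken from the docs:

```dockerfile
# Hypothetical custom sandbox image: start from a Debian-based base
# and layer on whatever additional software your use case needs.
FROM ubuntu:22.04

# Example extra package (illustrative; substitute your own requirements).
RUN apt-get update && apt-get install -y --no-install-recommends \
    ruby-full \
    && rm -rf /var/lib/apt/lists/*
```

A sketch like this would be built with `docker build -t my-custom-sandbox .` before pointing OpenHands at the resulting image.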


@@ -8,8 +8,8 @@ When running OpenHands, you'll need to set the following in the OpenHands UI thr
* `LLM Provider` to `Groq`
* `LLM Model` to the model you will be using. [Visit here to see the list of
models that Groq hosts](https://console.groq.com/docs/models). If the model is not in the list, toggle
-`Advanced Options`, and enter it in `Custom Model` (e.g. groq/<model-name> like `groq/llama3-70b-8192`)
-* `API key` to your Groq API key. To find or create your Groq API Key, [see here](https://console.groq.com/keys)
+`Advanced Options`, and enter it in `Custom Model` (e.g. groq/<model-name> like `groq/llama3-70b-8192`).
+* `API key` to your Groq API key. To find or create your Groq API Key, [see here](https://console.groq.com/keys).
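For development-mode setups, the same Groq settings could plausibly be expressed in `config.toml`. The key names below (`model` and `api_key` under `[llm]`) are assumptions for illustration, not taken from this diff:

```toml
[llm]
# Illustrative values; substitute your own model name and key.
model = "groq/llama3-70b-8192"
api_key = "your-groq-api-key"
```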


@@ -7,11 +7,10 @@ sidebar_position: 3
OpenHands can connect to any LLM supported by LiteLLM. However, it requires a powerful model to work.
The following are verified by the community to work with OpenHands:
-* claude-3-5-sonnet
+* claude-3-5-sonnet (recommended)
* gemini-1.5-pro / gemini-1.5-flash
* gpt-4 / gpt-4o
* llama-3.1-405b / hermes-3-llama-3.1-405b
* wizardlm-2-8x22b
:::warning
OpenHands will issue many prompts to the LLM you configure. Most of these LLMs cost money, so be sure to set spending
@@ -68,7 +67,7 @@ You can customize these options as you need for the provider you're using. Check
* `LLM_RETRY_MAX_WAIT` (Default of 120 seconds)
* `LLM_RETRY_MULTIPLIER` (Default of 2)
-If you running `openhands` in development mode, you can also set these options to the values you need in `config.toml` file:
+If you are running OpenHands in development mode, you can also set these options in the `config.toml` file:
```toml
[llm]

Binary file changed, not shown (image: 22 KiB before, 24 KiB after).

Binary file changed, not shown (image: 26 KiB before and after).