docs: Update documentation to promote uv as recommended installation method (#10291)

Co-authored-by: openhands <openhands@all-hands.dev>
Xingyao Wang 2025-08-13 19:11:02 -04:00 committed by GitHub
parent 4f436922ca
commit 5e85986f32
3 changed files with 90 additions and 28 deletions


@@ -52,14 +52,31 @@ which comes with $20 in free credits for new users.
## 💻 Running OpenHands Locally
OpenHands can also run on your local system using Docker.
See the [Running OpenHands](https://docs.all-hands.dev/usage/installation) guide for
system requirements and more information.
### Option 1: CLI Launcher (Recommended)
> [!WARNING]
> On a public network? See our [Hardened Docker Installation Guide](https://docs.all-hands.dev/usage/runtimes/docker#hardened-docker-installation)
> to secure your deployment by restricting network binding and implementing additional security measures.
The easiest way to run OpenHands locally is using the CLI launcher with [uv](https://docs.astral.sh/uv/). This provides better isolation from your current project's virtual environment and is required for OpenHands' default MCP servers.
**Install uv** (if you haven't already):
See the [uv installation guide](https://docs.astral.sh/uv/getting-started/installation/) for the latest installation instructions for your platform.
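At the time of writing, the installers are one-line scripts; the guide above remains the authoritative source, and the commands below are shown only for convenience:

```bash
# macOS / Linux (standalone installer)
curl -LsSf https://astral.sh/uv/install.sh | sh

# Windows (PowerShell)
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"

# Or via an existing Python installation
pip install uv
```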
**Launch OpenHands**:
```bash
# Launch the GUI server
uvx --python 3.12 --from openhands-ai openhands serve
# Or launch the CLI
uvx --python 3.12 --from openhands-ai openhands
```
You'll find OpenHands running at [http://localhost:3000](http://localhost:3000) (for GUI mode)!
### Option 2: Docker
<details>
<summary>Click to expand Docker command</summary>
You can also run OpenHands directly with Docker:
```bash
docker pull docker.all-hands.dev/all-hands-ai/runtime:0.52-nikolaik
@@ -75,14 +92,23 @@ docker run -it --rm --pull=always \
docker.all-hands.dev/all-hands-ai/openhands:0.52
```
</details>
> **Note**: If you used OpenHands before version 0.44, you may want to run `mv ~/.openhands-state ~/.openhands` to migrate your conversation history to the new location.
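If you are unsure whether you already migrated, a defensive variant of that command only moves the old directory when the new location does not exist yet:

```bash
# Migrate old conversation state only if it hasn't been moved already
if [ -d ~/.openhands-state ] && [ ! -d ~/.openhands ]; then
  mv ~/.openhands-state ~/.openhands
fi
```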
You'll find OpenHands running at [http://localhost:3000](http://localhost:3000)!
### Getting Started
When you open the application, you'll be asked to choose an LLM provider and add an API key.
[Anthropic's Claude Sonnet 4](https://www.anthropic.com/api) (`anthropic/claude-sonnet-4-20250514`)
works best, but you have [many options](https://docs.all-hands.dev/usage/llms).
## 💡 Other ways to run OpenHands
> [!WARNING]
@@ -93,8 +119,8 @@ works best, but you have [many options](https://docs.all-hands.dev/usage/llms).
> [OpenHands Cloud Helm Chart](https://github.com/all-Hands-AI/OpenHands-cloud)
You can [connect OpenHands to your local filesystem](https://docs.all-hands.dev/usage/runtimes/docker#connecting-to-your-filesystem),
run OpenHands in a scriptable [headless mode](https://docs.all-hands.dev/usage/how-to/headless-mode),
interact with it via a [friendly CLI](https://docs.all-hands.dev/usage/how-to/cli-mode),
or run it on tagged issues with [a github action](https://docs.all-hands.dev/usage/how-to/github-action).
Visit [Running OpenHands](https://docs.all-hands.dev/usage/installation) for more information and setup instructions.
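As a rough sketch of the headless workflow, the whole run can be scripted from the shell. The entrypoint and environment variable names below follow the headless-mode guide linked above and should be treated as assumptions; the task text is illustrative:

```bash
# Configure the LLM for this shell session (values are examples)
export LLM_MODEL="anthropic/claude-sonnet-4-20250514"
export LLM_API_KEY="<your-api-key>"

# Run a single task without the web UI (see the headless-mode guide for the exact command)
python -m openhands.core.main -t "write a bash script that prints hi"
```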


@@ -20,27 +20,42 @@ for scripting.
### Running with Python
**Note** - OpenHands requires Python version 3.12 or higher (Python 3.14 is not currently supported) and `uv` for the default `fetch` MCP server (more details below).
#### Recommended: Using uv
We recommend using [uv](https://docs.astral.sh/uv/) for the best OpenHands experience. uv provides better isolation from your current project's virtual environment and is required for OpenHands' default MCP servers.
1. **Install uv** (if you haven't already):
See the [uv installation guide](https://docs.astral.sh/uv/getting-started/installation/) for the latest installation instructions for your platform.
2. **Launch OpenHands CLI**:
```bash
uvx --python 3.12 --from openhands-ai openhands
```
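If you need a specific release rather than the latest, `uvx` accepts a full requirement specifier in `--from`; the version below is only an example:

```bash
uvx --python 3.12 --from 'openhands-ai==0.52.0' openhands
```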
<AccordionGroup>
<Accordion title="Alternative: Traditional pip installation">
If you prefer to use pip:
```bash
# Install OpenHands
pip install openhands-ai
```
Note that you'll still need `uv` installed for the default MCP servers to work properly.
</Accordion>
<Accordion title="Create shell aliases for easy access across environments">
Add the following to your shell configuration file (`.bashrc`, `.zshrc`, etc.):
```bash
# Add OpenHands aliases (recommended)
alias openhands="uvx --python 3.12 --from openhands-ai openhands"
alias oh="uvx --python 3.12 --from openhands-ai openhands"
```
@@ -72,9 +87,10 @@ source ~/.bashrc # or source ~/.zshrc
</AccordionGroup>
3. Launch an interactive OpenHands conversation from the command line:
```bash
openhands
# If using uvx (recommended)
uvx --python 3.12 --from openhands-ai openhands
```
<Note>
@@ -83,7 +99,7 @@ openhands
poetry run openhands
</Note>
4. Set your model, API key, and other preferences using the UI (or alternatively environment variables, below).
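For a non-interactive setup, you can export the settings before launching; the variable names below are an assumption, so confirm them against the environment-variable section referenced in step 4:

```bash
# Illustrative only; check the documented variable names
export LLM_MODEL="anthropic/claude-sonnet-4-20250514"
export LLM_API_KEY="<your-api-key>"

uvx --python 3.12 --from openhands-ai openhands
```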
This command opens an interactive prompt where you can type tasks or commands and get responses from OpenHands.
The first time you run the CLI, it will take you through configuring the required LLM


@@ -66,9 +66,31 @@ A system with a modern processor and a minimum of **4GB RAM** is recommended to
### Start the App
#### Option 1: Using the CLI Launcher with uv (Recommended)
We recommend using [uv](https://docs.astral.sh/uv/) for the best OpenHands experience. uv provides better isolation from your current project's virtual environment and is required for OpenHands' default MCP servers (like the [fetch MCP server](https://github.com/modelcontextprotocol/servers/tree/main/src/fetch)).
**Install uv** (if you haven't already):
See the [uv installation guide](https://docs.astral.sh/uv/getting-started/installation/) for the latest installation instructions for your platform.
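Once installed, a quick sanity check confirms that `uv` and the bundled `uvx` command are on your `PATH`:

```bash
uv --version
uvx --help
```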
**Launch OpenHands**:
```bash
# Launch the GUI server
uvx --python 3.12 --from openhands-ai openhands serve
# Or with GPU support (requires nvidia-docker)
uvx --python 3.12 --from openhands-ai openhands serve --gpu
# Or with current directory mounted
uvx --python 3.12 --from openhands-ai openhands serve --mount-cwd
```
This will automatically handle Docker requirements checking, image pulling, and launching the GUI server. The `--gpu` flag enables GPU support via nvidia-docker, and `--mount-cwd` mounts your current directory into the container.
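The GPU and mount flags are independent options, so they can presumably be combined; the exact combination below is an assumption rather than something taken from the docs, and `uvx --python 3.12 --from openhands-ai openhands serve --help` lists the full set of flags:

```bash
# Assumed to compose; verify with --help
uvx --python 3.12 --from openhands-ai openhands serve --gpu --mount-cwd
```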
<Accordion title="Alternative: Traditional pip installation">
If you prefer to use pip and have Python 3.12+ installed:
```bash
# Install OpenHands
@@ -76,20 +98,16 @@ pip install openhands-ai
# Launch the GUI server
openhands serve
# Or with GPU support (requires nvidia-docker)
openhands serve --gpu
# Or with current directory mounted
openhands serve --mount-cwd
```
Note that you'll still need `uv` installed for the default MCP servers to work properly.
</Accordion>
#### Option 2: Using Docker Directly
<Accordion title="Docker Command (Click to expand)">
```bash
docker pull docker.all-hands.dev/all-hands-ai/runtime:0.52-nikolaik
@@ -104,6 +122,8 @@ docker run -it --rm --pull=always \
docker.all-hands.dev/all-hands-ai/openhands:0.52
```
</Accordion>
> **Note**: If you used OpenHands before version 0.44, you may want to run `mv ~/.openhands-state ~/.openhands` to migrate your conversation history to the new location.
You'll find OpenHands running at http://localhost:3000!