From 9d5bc6b332957a8846de2c5886b3475cea7b82a6 Mon Sep 17 00:00:00 2001
From: Robert Brennan
Date: Thu, 18 Apr 2024 11:49:17 -0400
Subject: [PATCH] Update README.md (#1212)

---
 README.md | 9 +++++----
 1 file changed, 5 insertions(+), 4 deletions(-)

diff --git a/README.md b/README.md
index e288aa25a3..54f882354c 100644
--- a/README.md
+++ b/README.md
@@ -121,7 +121,8 @@ After completing the MVP, the team will focus on research in various areas, incl
 
 ## 🚀 Get Started
 The easiest way to run OpenDevin is inside a Docker container.
-You can run:
+
+To start the app, run these commands, replacing `$(pwd)/workspace` with the path to the code you want OpenDevin to work with.
 ```bash
 # Your OpenAI API key, or any other LLM API key
 export LLM_API_KEY="sk-..."
@@ -135,11 +136,11 @@ docker run \
     -v $WORKSPACE_DIR:/opt/workspace_base \
     -v /var/run/docker.sock:/var/run/docker.sock \
     -p 3000:3000 \
-    ghcr.io/opendevin/opendevin:main
+    ghcr.io/opendevin/opendevin:0.3.1
 ```
-Replace `$(pwd)/workspace` with the path to the code you want OpenDevin to work with.
+You'll find opendevin running at `http://localhost:3000`.
 
-You can find opendevin running at `http://localhost:3000`.
+If you want to use the (unstable!) bleeding edge, you can use `ghcr.io/opendevin/opendevin:main` as the image.
 
 See [Development.md](Development.md) for instructions on running OpenDevin without Docker.
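
Read together, the hunks above amount to the quick-start sequence below. This is a reassembled sketch rather than the README verbatim: some `docker run` flags sit between the two hunks and are not part of this diff, and the `export WORKSPACE_DIR=...` line is inferred from the `$WORKSPACE_DIR` mount and the "replacing `$(pwd)/workspace`" instruction in the added prose.

```bash
# Your OpenAI API key, or any other LLM API key
export LLM_API_KEY="sk-..."

# Inferred from the diff context: the docker command mounts $WORKSPACE_DIR,
# and the patch says to replace $(pwd)/workspace with the path to the code
# you want OpenDevin to work with.
export WORKSPACE_DIR=$(pwd)/workspace

# Note: the README's full flag list includes lines that fall between the two
# hunks and are not visible in this diff; only the flags shown there appear here.
docker run \
    -v $WORKSPACE_DIR:/opt/workspace_base \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    ghcr.io/opendevin/opendevin:0.3.1  # or :main for the (unstable!) bleeding edge
```

Once the container is up, the updated README points you to `http://localhost:3000`; the substantive change in this patch is pinning the recommended image to the `0.3.1` release tag instead of `main`.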