CodeAct-based Agent Framework

This folder implements the CodeAct idea, which relies on the LLM to autonomously perform actions in a Bash shell. It places higher demands on the LLM itself: the model needs to be capable enough to carry out every step on its own rather than getting stuck in an infinite loop.

NOTE: This agent is still highly experimental and under active development to reach the capability described in the original paper & repo.
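At a high level, the agent alternates between model turns and shell executions. The loop below is a minimal sketch of that idea, assuming a hypothetical `llm` callable that maps a message history to the model's next reply; it is not the actual CodeActAgent implementation, and the helper names (`run_codeact_episode`, `EXECUTE_BLOCK`, `max_turns`) are illustrative only.

```python
# Minimal sketch of a CodeAct-style loop (illustrative, not the opendevin code):
# the LLM emits <execute>...</execute> blocks, the agent runs them in a Bash
# shell, and the captured output is fed back as the next observation until the
# model emits `exit`.
import re
import subprocess

EXECUTE_BLOCK = re.compile(r"<execute>(.*?)</execute>", re.DOTALL)

def run_codeact_episode(llm, task: str, max_turns: int = 10) -> list:
    """`llm` is a placeholder for any callable mapping a message history to text."""
    history = [{"role": "user", "content": task}]
    for _ in range(max_turns):
        response = llm(history)
        history.append({"role": "assistant", "content": response})
        match = EXECUTE_BLOCK.search(response)
        if match is None:
            break  # the model replied without an action; stop here
        command = match.group(1).strip()
        if command == "exit":
            break  # the model signalled that the task is finished
        # Run the command in a Bash shell and capture stdout/stderr as the observation.
        result = subprocess.run(
            command, shell=True, executable="/bin/bash",
            capture_output=True, text=True, timeout=120,
        )
        observation = result.stdout + result.stderr
        history.append({"role": "user", "content": f"Observation:\n{observation}"})
    return history
```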

Demo of the expected capability (work in progress):

```bash
mkdir workspace
PYTHONPATH=`pwd`:$PYTHONPATH python3 opendevin/main.py -d ./workspace -c CodeActAgent -t "Please write a flask app that returns 'Hello, World\!' at the root URL, then start the app on port 5000. python3 has already been installed for you."
```

Example: prompting gpt-4-0125-preview to write a Flask server, install the flask library, and start the server.


Most of this works as expected, except that at the end the model does not follow the instruction to stop the interaction by outputting `<execute> exit </execute>`.

TODO: This should be fixable by either (1) including a complete in-context example like this, or (2) collecting some interaction data like this and fine-tuning a model (like this; a more complex route). A rough sketch of option (1) is given below.
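For option (1), one possibility is to append a short, complete episode to the prompt so the model sees an interaction that ends with the exit command. The `IN_CONTEXT_EXAMPLE` string below is a hypothetical sketch of what such an example could look like; it is not the prompt actually used by CodeActAgent.

```python
# Hypothetical shape of a complete in-context example that ends with the exit
# command; the exact prompt format used by CodeActAgent may differ.
IN_CONTEXT_EXAMPLE = """
Task: Print the current working directory, then finish.

Assistant: I will print the working directory.
<execute> pwd </execute>

Observation: /workspace

Assistant: The task is done, so I will stop the interaction.
<execute> exit </execute>
"""
```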