
CodeAct-based Agent Framework

This folder implements the CodeAct idea, which relies on the LLM to autonomously perform actions in a Bash shell. It demands more from the LLM itself: the model needs to be capable enough to carry out all the steps on its own, rather than getting stuck in an infinite loop.
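Roughly, the agent loop works as in the sketch below. This is a simplification, not the actual CodeActAgent implementation; the llm callable, the <execute> tag format, and the turn limit are assumptions for illustration only.

import re
import subprocess

EXECUTE_RE = re.compile(r"<execute>(.*?)</execute>", re.DOTALL)

def codeact_loop(llm, task, max_turns=10):
    # Keep a running transcript of the task, model replies, and command output.
    history = ["Task: " + task]
    for _ in range(max_turns):
        reply = llm("\n".join(history))  # llm() is a stand-in for the real model call
        history.append(reply)
        match = EXECUTE_RE.search(reply)
        if match is None:
            break  # no action emitted; stop rather than loop forever
        command = match.group(1).strip()
        if command == "exit":
            break  # the model signals that the task is finished
        # Run the command in a shell and feed the output back as an observation.
        result = subprocess.run(command, shell=True, capture_output=True, text=True)
        history.append("Observation:\n" + result.stdout + result.stderr)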

A minimalistic example can be found at research/codeact/examples/run_flask_server_with_bash.py:

mkdir workspace
PYTHONPATH=`pwd`:$PYTHONPATH python3 opendevin/main.py -d ./workspace -c CodeActAgent -t "Please write a flask app that returns 'Hello, World\!' at the root URL, then start the app on port 5000. python3 has already been installed for you."

Example: prompting gpt-3.5-turbo-0125 to write a flask server, install the flask library, and start the server.


Most things work as expected, except that at the end the model did not follow the instruction to stop the interaction by outputting <execute> exit </execute>.

TODO: This should be fixable either by (1) including a complete in-context example like this, or (2) collecting some interaction data like this and fine-tuning a model (like this, a more complex route).
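For option (1), the in-context example might look something like the sketch below. This is only an illustration; the exact prompt format used by CodeActAgent may differ.

# Hypothetical few-shot example that could be prepended to the prompt to show
# the model how to end an interaction with <execute> exit </execute>.
IN_CONTEXT_EXAMPLE = """
Task: create an empty file named hello.txt, then finish.
Assistant: <execute> touch hello.txt </execute>
Observation: (no output)
Assistant: hello.txt has been created, so the task is done. <execute> exit </execute>
"""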