Mirror of https://github.com/OpenHands/OpenHands.git (synced 2025-12-26 05:48:36 +08:00)
Standardize names and better organization (#3453)
* Standardize names and better organization
* Update links
parent d1b9787751 · commit 14a4e45cbb
@@ -25,8 +25,8 @@ The following environment variables may be necessary for some LLMs/providers:

We have a few guides for running OpenDevin with specific model providers:

-- [ollama](llms/localLLMs)
-- [Azure](llms/azureLLMs)
+- [ollama](llms/local-llms)
+- [Azure](llms/azure-llms)

If you're using another provider, we encourage you to open a PR to share your setup!
@@ -85,7 +85,7 @@ AttributeError: 'NoneType' object has no attribute 'request'

[GitHub Issues](https://github.com/OpenDevin/OpenDevin/issues?q=is%3Aissue+is%3Aopen+404)

This usually happens with *local* LLM setups, when OpenDevin can't connect to the LLM server.
-See our guide for [local LLMs](llms/localLLMs) for more information.
+See our guide for [local LLMs](llms/local-llms) for more information.

### Workarounds
@@ -133,9 +133,9 @@ the API endpoint you're trying to connect to. This happens
* If you're running in the UI, be sure to set the `model` in the settings modal
* If you're running headless (via main.py), be sure to set `LLM_MODEL` in your env/config
* Make sure you've followed any special instructions for your LLM provider
-  * [ollama](/fr/modules/usage/llms/localLLMs)
-  * [Azure](/fr/modules/usage/llms/azureLLMs)
-  * [Google](/fr/modules/usage/llms/googleLLMs)
+  * [ollama](/fr/modules/usage/llms/local-llms)
+  * [Azure](/fr/modules/usage/llms/azure-llms)
+  * [Google](/fr/modules/usage/llms/google-llms)
* Make sure your API key is correct
* See if you can connect to the LLM using `curl`
* Try [connecting via LiteLLM directly](https://github.com/BerriAI/litellm) to test your setup (see the sketch below)
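To make that last check concrete, here is a minimal sketch of calling a local model through LiteLLM directly. The model name `ollama/codellama` and the `http://localhost:11434` endpoint are assumptions for a default ollama setup; substitute whatever your provider actually serves.

```python
# Minimal LiteLLM connectivity test (assumed local ollama setup; adjust model/api_base).
from litellm import completion

response = completion(
    model="ollama/codellama",                  # assumption: a model you have pulled locally
    messages=[{"role": "user", "content": "Say hello"}],
    api_base="http://localhost:11434",         # assumption: ollama's default address
)
print(response.choices[0].message.content)
```

If this call fails in the same way OpenDevin does, the problem lies with the LLM server or its configuration rather than with OpenDevin itself.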
@@ -25,8 +25,8 @@ OpenDevin will send many prompts to the LLM you configure. Most of these LLMs are

We have a few guides for running OpenDevin with specific model providers:

-- [ollama](llms/localLLMs)
-- [Azure](llms/azureLLMs)
+- [ollama](llms/local-llms)
+- [Azure](llms/azure-llms)

If you're using another provider, we encourage you to open a PR to share your setup!
@@ -81,7 +81,7 @@ AttributeError: 'NoneType' object has no attribute 'request'

[GitHub Issues](https://github.com/OpenDevin/OpenDevin/issues?q=is%3Aissue+is%3Aopen+404)

-This usually happens with local LLM setups, when OpenDevin can't connect to the LLM server. See our [local LLM guide](llms/localLLMs) for more information.
+This usually happens with local LLM setups, when OpenDevin can't connect to the LLM server. See our [local LLM guide](llms/local-llms) for more information.

### Workarounds
@@ -128,9 +128,9 @@ openai.NotFoundError: Error code: 404 - {'error': {'code': '404', 'message': 'Re
* If you're running in the UI, make sure to set the `model` in the settings modal
* If you're running via main.py, make sure to set `LLM_MODEL` in your env/config
* Make sure you've followed any special instructions for your LLM provider
-  * [Ollama](/zh-Hans/modules/usage/llms/localLLMs)
-  * [Azure](/zh-Hans/modules/usage/llms/azureLLMs)
-  * [Google](/zh-Hans/modules/usage/llms/googleLLMs)
+  * [Ollama](/zh-Hans/modules/usage/llms/local-llms)
+  * [Azure](/zh-Hans/modules/usage/llms/azure-llms)
+  * [Google](/zh-Hans/modules/usage/llms/google-llms)
* Make sure your API key is correct
* Try connecting to the LLM using `curl` (see the sketch after this list)
* Try [connecting via LiteLLM directly](https://github.com/BerriAI/litellm) to test your setup
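As a quick stand-in for the `curl` check above, a minimal reachability probe might look like the following. The URL assumes ollama's default local endpoint; point it at whatever host and port your LLM server actually listens on.

```python
import urllib.request

# Probe the LLM server the same way `curl http://localhost:11434` would.
# The address is an assumption for a default ollama install; adjust as needed.
URL = "http://localhost:11434"

try:
    with urllib.request.urlopen(URL, timeout=5) as resp:
        print(resp.status, resp.read()[:200])
except OSError as exc:  # urllib's URLError is a subclass of OSError
    print(f"Could not reach {URL}: {exc}")
```

A connection error here means OpenDevin has no chance of reaching the server either, so fix the server address or startup first.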
@@ -2,7 +2,7 @@
sidebar_position: 7
---

-# 🏛️ System Architecture Overview
+# 🏛️ System Architecture

<div style={{ textAlign: 'center' }}>
  <img src="https://github.com/OpenDevin/OpenDevin/assets/16201837/97d747e3-29d8-4ccb-8d34-6ad1adb17f38" alt="OpenDevin System Architecture Diagram Jul 4 2024" />
@@ -1,7 +1,3 @@
----
-sidebar_position: 4
----
-
# 📦 EventStream Runtime

The OpenDevin EventStream Runtime is the core component that enables secure and flexible execution of an AI agent's actions.
@@ -1,8 +1,4 @@
----
-sidebar_position: 6
----
-
-# 💿 How to Create and Use a Custom Docker Sandbox
+# Create and Use a Custom Docker Sandbox

The default OpenDevin sandbox comes with a [minimal ubuntu configuration](https://github.com/OpenDevin/OpenDevin/blob/main/containers/sandbox/Dockerfile).
@@ -1,8 +1,4 @@
----
-sidebar_position: 6
----
-
-# 📈 How to contribute to OpenDevin Evaluation Harness
+# Contribute to OpenDevin Evaluation Harness

This guide provides an overview of how to integrate your own evaluation benchmark into the OpenDevin framework.
docs/modules/usage/how-to/how-to.md (new file, 5 lines)
@@ -0,0 +1,5 @@
+---
+sidebar_position: 6
+---
+
+# 🔎 How To Section
@@ -1,8 +1,4 @@
----
-sidebar_position: 6
----
-
-# 💿 How to use OpenDevin in OpenShift/K8S
+# Use OpenDevin in OpenShift/K8S

There are different ways and scenarios to do this; we're just mentioning one example here:
1. Create a PV "as a cluster admin" to map workspace_base data and the docker directory to the pod through the worker node.
@@ -26,8 +26,8 @@ The following environment variables might be necessary for some LLMs/providers:

We have a few guides for running OpenDevin with specific model providers:

-- [ollama](llms/localLLMs)
-- [Azure](llms/azureLLMs)
+- [ollama](llms/local-llms)
+- [Azure](llms/azure-llms)

If you're using another provider, we encourage you to open a PR to share your setup!
@@ -96,7 +96,7 @@ AttributeError: 'NoneType' object has no attribute 'request'

[GitHub Issues](https://github.com/OpenDevin/OpenDevin/issues?q=is%3Aissue+is%3Aopen+404)

This usually happens with *local* LLM setups, when OpenDevin can't connect to the LLM server.
-See our guide for [local LLMs](llms/localLLMs) for more information.
+See our guide for [local LLMs](llms/local-llms) for more information.

**Workarounds**
@@ -145,9 +145,9 @@ the API endpoint you're trying to connect to. Most often this happens for Azure
* If you're running inside the UI, be sure to set the `model` in the settings modal
* If you're running headless (via main.py), be sure to set `LLM_MODEL` in your env/config (see the sketch after this list)
* Make sure you've followed any special instructions for your LLM provider
-  * [ollama](/modules/usage/llms/localLLMs)
-  * [Azure](/modules/usage/llms/azureLLMs)
-  * [Google](/modules/usage/llms/googleLLMs)
+  * [ollama](/modules/usage/llms/local-llms)
+  * [Azure](/modules/usage/llms/azure-llms)
+  * [Google](/modules/usage/llms/google-llms)
* Make sure your API key is correct
* See if you can connect to the LLM using `curl`
* Try [connecting via LiteLLM directly](https://github.com/BerriAI/litellm) to test your setup
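For the headless case, a minimal sketch of what that env/config setup could look like is shown below. `LLM_MODEL` is the variable named in the bullet above; the other variable names and all of the values are assumptions for a local, ollama-style setup, so check your provider guide for the exact names.

```python
import os

# Hypothetical headless configuration, set before launching main.py.
# LLM_MODEL is documented above; LLM_BASE_URL and LLM_API_KEY are assumed names
# for a local, OpenAI-compatible server and may differ in your setup.
os.environ["LLM_MODEL"] = "ollama/codellama"
os.environ["LLM_BASE_URL"] = "http://localhost:11434"
os.environ["LLM_API_KEY"] = "not-needed-for-most-local-servers"
```

Exporting the same variables in the shell that launches main.py works just as well; the point is that the model name has to reach OpenDevin one way or another.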
@@ -2,7 +2,7 @@
sidebar_position: 8
---

-# Upgrade Guide
+# ⬆️ Upgrade Guide

## 0.8.0 (2024-07-13)