---
title: Groq
description: OpenHands uses LiteLLM to make calls to chat models on Groq. You can find their documentation on using Groq as a provider [here](https://docs.litellm.ai/docs/providers/groq).
---
## Configuration
When running OpenHands, you'll need to set the following in the OpenHands UI through the Settings under the `LLM` tab:

- `LLM Provider` to `Groq`
- `LLM Model` to the model you will be using. [Visit here to see the list of models that Groq hosts](https://console.groq.com/docs/models). If the model is not in the list, enable `Advanced` options, and enter it in `Custom Model` (e.g. `groq/<model-name>`, like `groq/llama3-70b-8192`).
- `API key` to your Groq API key. To find or create your Groq API Key, [see here](https://console.groq.com/keys).
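The same settings can also be expressed in a configuration file rather than the UI. The fragment below is a sketch assuming the standard `[llm]` keys (`model`, `api_key`) used elsewhere in OpenHands' `config.toml`; the model string uses the LiteLLM-style `groq/` prefix described above:

```toml
# Sketch of a config.toml fragment mirroring the UI settings above.
[llm]
model = "groq/llama3-70b-8192"  # LiteLLM provider prefix + Groq model name
api_key = "your-groq-api-key"   # create one at https://console.groq.com/keys
```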
## Using Groq as an OpenAI-Compatible Endpoint
The Groq endpoint for chat completion is [mostly OpenAI-compatible](https://console.groq.com/docs/openai). Therefore, you can access Groq models as you would access any OpenAI-compatible endpoint. In the OpenHands UI through the Settings under the `LLM` tab:

1. Enable `Advanced` options
2. Set the following:
   - `Custom Model` to the prefix `openai/` + the model you will be using (e.g. `openai/llama3-70b-8192`)
   - `Base URL` to `https://api.groq.com/openai/v1`
   - `API Key` to your Groq API key
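Independent of OpenHands, you can sanity-check these values by talking to Groq's OpenAI-compatible endpoint directly. The stdlib-only sketch below builds (but does not send) an OpenAI-style chat completion request; note that the bare model name is used here, since the `openai/` prefix above is only a routing hint for OpenHands/LiteLLM. The helper name `build_chat_request` is illustrative, not part of any library:

```python
# Build an OpenAI-style chat completion request against Groq's
# OpenAI-compatible endpoint, using only the Python standard library.
import json
import os
import urllib.request

GROQ_BASE_URL = "https://api.groq.com/openai/v1"

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Construct (but do not send) a chat completion request for Groq."""
    payload = {
        "model": model,  # bare model name, e.g. "llama3-70b-8192"
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{GROQ_BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {os.environ.get('GROQ_API_KEY', '')}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("llama3-70b-8192", "Say hello in one word.")
print(req.full_url)  # the Base URL + the OpenAI-style chat completions path
```

With a valid `GROQ_API_KEY` set, `urllib.request.urlopen(req)` returns an OpenAI-style JSON response, confirming the `Base URL` works before wiring it into OpenHands.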