Polly
27c49471a8
Fix Bug #8425 - Enable prompt cache for OpenRouter model of claude-3.7-sonnet ( #8426 )
2025-05-11 00:07:31 +02:00
Polly
90aab29bc0
Fix Issue #8413: max_output_tokens in openrouter/anthropic/claude-3.7-sonnet doesn't work correctly ( #8415 )
2025-05-10 08:29:39 +00:00
Graham Neubig
b5dbf81179
Fix typing issues in openhands/llm directory ( #8377 )
...
Co-authored-by: openhands <openhands@all-hands.dev>
2025-05-09 18:26:59 +00:00
AutoLTX
3d68711ca3
Display context window usage status in UI ( #8267 )
2025-05-09 11:39:14 +08:00
Graham Neubig
4ff43d1d99
Add claude-sonnet-latest to supported models lists ( #8365 )
...
Co-authored-by: openhands <openhands@all-hands.dev>
Co-authored-by: Engel Nyst <enyst@users.noreply.github.com>
2025-05-09 00:08:09 +00:00
Graham Neubig
d5a8d4251c
Fix missing comma in hosts ( #8345 )
2025-05-07 21:54:15 -04:00
Engel Nyst
9b1aaa53fe
Reset the reset ( #8079 )
2025-04-25 00:59:37 +02:00
Kenny Dizi
85e2b73eb4
[Feat] Support o3 and o4 mini ( #7898 )
...
Co-authored-by: Xingyao Wang <xingyao@all-hands.dev>
2025-04-18 21:23:19 +00:00
Xingyao Wang
e69ae81ad2
Add GPT-4.1 to function calling list ( #7866 )
...
Co-authored-by: Juan Michelini <juan@juan.com.uy>
Co-authored-by: openhands <openhands@all-hands.dev>
2025-04-15 18:12:15 +02:00
NarwhalChen
513f7ab7e7
fix(llm): ensure base_url has protocol prefix for model info fetch when using LiteLLM ( #7782 )
...
Co-authored-by: Engel Nyst <enyst@users.noreply.github.com>
Co-authored-by: sp.wack <83104063+amanape@users.noreply.github.com>
2025-04-10 20:10:06 +04:00
Graham Neubig
9b8a628395
Add more extensive typing to openhands/llm/ directory ( #7727 )
...
Co-authored-by: openhands <openhands@all-hands.dev>
2025-04-06 17:59:25 +00:00
Xingyao Wang
0ab9d97f2d
fix(llm): retry on InternalServerError and Timeout; handle Gemini returning len(choices) < 1 ( #7713 )
...
Co-authored-by: Engel Nyst <enyst@users.noreply.github.com>
Co-authored-by: openhands <openhands@all-hands.dev>
2025-04-05 21:33:17 +00:00
Engel Nyst
d7f651a06c
Fix cached tokens ( #7679 )
...
Co-authored-by: OpenHands Bot <openhands@all-hands.dev>
2025-04-03 01:11:18 +00:00
Xingyao Wang
648c8ffb21
(llm): Support OpenHands LM ( #7598 )
...
Co-authored-by: mamoodi <mamoodiha@gmail.com>
2025-03-31 17:29:31 +00:00
tofarr
1230b229b5
Replace use of requests with httpx ( #7354 )
2025-03-26 13:37:10 +00:00
மனோஜ்குமார் பழனிச்சாமி
2518901e6e
feat: Support seed parameter ( #7441 )
2025-03-24 19:18:00 +01:00
Xingyao Wang
6f9ced1c23
[Observability] add metadata to track llm request for sessions ( #7381 )
...
Co-authored-by: Robert Brennan <accounts@rbren.io>
2025-03-23 04:20:38 +08:00
Engel Nyst
cc45f5d9c3
Add RecallActions and observations for retrieval of prompt extensions ( #6909 )
...
Co-authored-by: openhands <openhands@all-hands.dev>
Co-authored-by: Calvin Smith <email@cjsmith.io>
2025-03-15 21:48:37 +01:00
Robert Brennan
366fd7ab8a
Improve agent loop tracking and make concurrent limit configurable ( #6945 )
...
Co-authored-by: openhands <openhands@all-hands.dev>
Co-authored-by: Tim O'Farrell <tofarr@gmail.com>
2025-03-07 09:19:50 -07:00
Xingyao Wang
b146b63380
chore: temporary fix to get sonnet 3.7 working again ( #7140 )
2025-03-06 20:30:35 +00:00
Engel Nyst
285010b48f
OpenAI models fixes ( #7045 )
2025-03-03 15:53:18 +01:00
Engel Nyst
8b234ae57c
Azure completion_tokens fix (take two) ( #6975 )
2025-02-27 02:28:01 +01:00
tofarr
b38039e626
Fix fd leak ( #6950 )
2025-02-26 09:35:38 -07:00
Graham Neubig
6ba79c454b
feat(llm): Add Claude 3.7 backend configurations ( #6937 )
...
Co-authored-by: openhands <openhands@all-hands.dev>
2025-02-25 16:46:53 +00:00
tofarr
f4c5bbda19
Revert "Fix file descriptor leak ( #6897 )" ( #6921 )
2025-02-24 13:47:13 -05:00
tofarr
29ba94fc0f
Fix file descriptor leak ( #6897 )
...
Co-authored-by: openhands <openhands@all-hands.dev>
2025-02-24 08:35:10 -07:00
Engel Nyst
2d2dbf1561
Use LLM APIs responses in token counting ( #5604 )
...
Co-authored-by: Calvin Smith <email@cjsmith.io>
2025-02-23 17:58:47 +01:00
Engel Nyst
bf82f75ae4
Revert "Fix: File Descriptor leak" ( #6887 )
2025-02-22 11:21:02 +00:00
tofarr
a20f299579
Fix: File Descriptor leak ( #6883 )
2025-02-21 14:47:59 -07:00
Engel Nyst
eed7e2dd6e
Refactor I/O utils; allow 'task' command line parameter in cli.py ( #6187 )
...
Co-authored-by: OpenHands Bot <openhands@all-hands.dev>
2025-02-19 22:10:14 +01:00
Robert Brennan
3a478c2303
Better LLM retry behavior ( #6557 )
...
Co-authored-by: Engel Nyst <enyst@users.noreply.github.com>
2025-02-17 10:36:59 -05:00
Cheng Yang
63565982aa
docs: improve docstrings for CLI and config utils ( #5398 )
...
Co-authored-by: Engel Nyst <enyst@users.noreply.github.com>
2025-02-15 01:51:59 +00:00
Engel Nyst
1a715d2ec4
Clean up global in llm.py (we figured it's not needed) ( #6675 )
2025-02-11 00:00:46 +01:00
Xingyao Wang
52ac2729f7
fix: set tool_choice to none for non-fncall models ( #6652 )
2025-02-07 12:49:08 -05:00
Engel Nyst
0d312a645a
Simplify fn calling usage ( #6596 )
2025-02-04 22:54:38 +01:00
Xingyao Wang
622fc5213d
[feat] support o3-mini ( #6570 )
2025-02-03 23:26:35 +08:00
Rick van Hattem
4ef09ab897
Update llm.py ( #6582 )
2025-02-02 03:24:46 +00:00
Aditya Bharat Soni
a593d9bc6d
Visual browsing in CodeAct using set-of-marks annotated webpage screenshots ( #6464 )
2025-02-02 04:56:11 +08:00
Engel Nyst
eb8d1600c3
Chore: clean up LLM (prompt caching, supports fn calling), leftover renames ( #6095 )
2025-02-01 18:14:08 +01:00
Ray Myers
fd73f4210e
Show LLM retries and allow resume from rate-limit state ( #6438 )
...
Co-authored-by: Engel Nyst <enyst@users.noreply.github.com>
2025-01-30 21:51:47 +00:00
Engel Nyst
89963e93d8
Re-add reasoning effort ( #6371 )
2025-01-21 04:22:48 +01:00
Calvin Smith
a12087243a
Pydantic-based configuration and setting objects ( #6321 )
...
Co-authored-by: Calvin Smith <calvin@all-hands.dev>
Co-authored-by: Graham Neubig <neubig@gmail.com>
Co-authored-by: Engel Nyst <enyst@users.noreply.github.com>
2025-01-17 12:33:22 -07:00
Xingyao Wang
e211647eba
fix: llm-proxy response_cost being 0 ( #6293 )
2025-01-16 15:33:22 +00:00
Alejandro Cuadron Lafuente
8579710c82
[Fix] Restored FC default for GPT-4o ( #6311 )
2025-01-16 15:27:57 +00:00
Alejandro Cuadron Lafuente
578291e961
Enabled native function calling for O1 and added support for the reasoning_effort option in the config. ( #6256 )
...
Co-authored-by: Engel Nyst <enyst@users.noreply.github.com>
2025-01-16 14:53:11 +00:00
tofarr
23473070b9
Revert "Config objects as Pydantic BaseModels ( #6176 )" ( #6214 )
2025-01-13 07:36:25 -07:00
Calvin Smith
873dddb4e8
Config objects as Pydantic BaseModels ( #6176 )
...
Co-authored-by: Calvin Smith <calvin@all-hands.dev>
Co-authored-by: Graham Neubig <neubig@gmail.com>
2025-01-12 15:09:45 -05:00
Engel Nyst
3d2138d9ce
Command line args fixes ( #5990 )
2025-01-05 02:58:26 +00:00
Xingyao Wang
aaff3dd075
fix(llm): cost metrics calculation for unsupported litellm prefix ( #6022 )
2025-01-04 18:09:13 +00:00
Engel Nyst
c567c11267
Enable/disable function calling by user configuration ( #5992 )
...
Co-authored-by: co <yc5@tju.edu.cn>
Co-authored-by: Cheng Yang <93481273+young010101@users.noreply.github.com>
2025-01-03 01:40:49 +01:00