
Commit c3b3785

fix(llm_client): catch context limit error when using mirothinker as main model (#57)
* upd: add futurex evaluation support.
* upd: support multiple eval runs for futurex and add relevant docs.
* upd: fix bugs with docs for futurex.
* debug: fix wrong calling path.
* add preparation for finsearchcomp.
* update a premature version of the finsearchcomp benchmark.
* clean redundant code in merging.
* upd: modify yaml to use MiroThinker as the main agent, add check-progress file to exclude T1.
* upd: check_progress function for finsearchcomp now considers globe and greater china separately.
* upd: add docs and shell script for multiple runs.
* fix: check_finsearchcomp_progress not displaying results from the greater china region.
* fix: catch ContextLimitError in more observed cases.
1 parent 734cabe commit c3b3785

File tree

2 files changed: +4 −0 lines

src/llm/providers/claude_openrouter_client.py (2 additions, 0 deletions)

@@ -191,6 +191,8 @@ async def _create_message(
             or "exceeds the maximum length" in error_str
             or "exceeds the maximum allowed length" in error_str
             or "Input tokens exceed the configured limit" in error_str
+            or "Requested token count exceeds the model's maximum context length" in error_str
+            or "BadRequestError" in error_str and "context length" in error_str
         ):
             logger.debug(f"OpenRouter LLM Context limit exceeded: {error_str}")
             raise ContextLimitError(f"Context limit exceeded: {error_str}")
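The added conditions extend a chain of substring checks on the provider's error text. One subtlety worth noting: in Python, `and` binds tighter than `or`, so the new `"BadRequestError" ... and "context length" ...` clause is evaluated as a unit and matches only when both substrings are present. A minimal standalone sketch of the check (the function name `looks_like_context_limit` and the sample inputs are illustrative assumptions, not code from the repository):

```python
def looks_like_context_limit(error_str: str) -> bool:
    """Heuristic sketch: does a provider error message indicate a context-limit failure?"""
    return (
        "exceeds the maximum length" in error_str
        or "exceeds the maximum allowed length" in error_str
        or "Input tokens exceed the configured limit" in error_str
        or "Requested token count exceeds the model's maximum context length" in error_str
        # `and` binds tighter than `or`: this clause requires BOTH substrings.
        or "BadRequestError" in error_str and "context length" in error_str
    )
```

With this grouping, a generic `BadRequestError` (say, malformed JSON) does not match, while a `BadRequestError` that mentions "context length" does.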

src/llm/providers/mirothinker_sglang_client.py (2 additions, 0 deletions)

@@ -159,6 +159,8 @@ async def _create_message(
             or "exceeds the maximum length" in error_str
             or "exceeds the maximum allowed length" in error_str
             or "Input tokens exceed the configured limit" in error_str
+            or "Requested token count exceeds the model's maximum context length" in error_str
+            or "BadRequestError" in error_str and "context length" in error_str
         ):
             logger.debug(f"MiroThinker LLM Context limit exceeded: {error_str}")
             raise ContextLimitError(f"Context limit exceeded: {error_str}")
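Both clients follow the same pattern: detect a provider-specific error message and re-raise it as a single `ContextLimitError` so callers can handle context overflows uniformly (for example, by truncating history and retrying). A hedged sketch of that catch-and-translate flow, with a locally defined stand-in for `ContextLimitError` (the real class lives elsewhere in the repository, and `send_request` is a hypothetical callable):

```python
class ContextLimitError(Exception):
    """Stand-in: raised when a provider rejects a request for exceeding its context window."""


def call_model(send_request) -> str:
    """Invoke a provider call, normalizing context-limit failures into ContextLimitError."""
    try:
        return send_request()
    except Exception as e:
        error_str = str(e)
        if "Requested token count exceeds the model's maximum context length" in error_str:
            # Translate the provider-specific message into one exception type
            # so upstream code need not know each provider's wording.
            raise ContextLimitError(f"Context limit exceeded: {error_str}")
        raise  # anything else propagates unchanged
```

The benefit of normalizing here is that retry or truncation logic upstream can catch one exception type instead of pattern-matching every provider's error strings itself.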

0 commit comments