fix(api): resolve model provider from config to prevent misrouting

When the model dropdown sends a prefixed ID like "anthropic/claude-xxx",
AIAgent interprets the provider/model format as an OpenRouter path and
routes through OpenRouter instead of the direct Anthropic API.

Fix: read the configured provider from config.yaml model section. If
the model ID starts with the configured provider name followed by "/",
strip that prefix and pass the provider explicitly to AIAgent. This
ensures direct API providers (Anthropic, OpenAI, etc.) are used when
configured, regardless of the model ID format from the dropdown.
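The resolution logic described above can be sketched as a small helper. This is a minimal sketch, not the committed code: `resolve_model` and its `cfg` argument are illustrative names standing in for the parsed config.yaml contents.

```python
def resolve_model(model_id, cfg):
    """Strip a leading '<provider>/' prefix from model_id when it
    matches the provider configured under the config's model section,
    so direct-API model IDs aren't misread as OpenRouter paths."""
    model = model_id or ''
    provider = None
    mc = cfg.get('model', {})
    if isinstance(mc, dict):
        provider = mc.get('provider')
    if provider and model.startswith(provider + '/'):
        # e.g. "anthropic/claude-..." with provider "anthropic" -> "claude-..."
        model = model.split('/', 1)[1]
    return model, provider
```

A model ID whose prefix does not match the configured provider is passed through unchanged, so OpenRouter-style IDs still work when OpenRouter is the configured provider.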
Author: deboste
Date: 2026-03-31 14:35:54 +00:00
parent a9ae0b0a83
commit 2864c2b691
2 changed files with 20 additions and 1 deletion


@@ -629,7 +629,15 @@ def _handle_chat_sync(handler, body):
     try:
         from run_agent import AIAgent
         with CHAT_LOCK:
-            agent = AIAgent(model=s.model, platform='cli', quiet_mode=True,
+            from api.config import cfg as _hcfg
+            _model = s.model or ''
+            _prov = None
+            _mc = _hcfg.get('model', {})
+            if isinstance(_mc, dict):
+                _prov = _mc.get('provider')
+            if _prov and '/' in _model and _model.startswith(_prov + '/'):
+                _model = _model.split('/', 1)[1]
+            agent = AIAgent(model=_model, provider=_prov, platform='cli', quiet_mode=True,
                 enabled_toolsets=CLI_TOOLSETS, session_id=s.session_id)
         workspace_ctx = f"[Workspace: {s.workspace}]\n"
         workspace_system_msg = (