Root cause: resolve_model_provider() had a branch:
if config_provider and config_provider != 'openrouter' and prefix in _PROVIDER_MODELS:
return bare, prefix, None
When Camanji profile (config_provider='anthropic') picked openai/gpt-5.4-mini
from the OpenRouter dropdown, prefix='openai' matched _PROVIDER_MODELS and
config_provider was not 'openrouter', so it returned ('gpt-5.4-mini', 'openai', None).
The agent then called the OpenAI API directly and required OPENAI_API_KEY; the
key was not set, so a RuntimeError was raised, the stream crashed, and the user
saw 'Connection lost'.
Fix: if prefix != config_provider (cross-provider selection), always route through
openrouter with the full provider/model string. Only strip the prefix and call a
direct provider API when the config_provider EXACTLY matches the model prefix.
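A minimal sketch of the corrected routing logic, assuming a _PROVIDER_MODELS
registry keyed by provider name; the simplified signature and the two-tuple
return (the trailing None is dropped here for brevity) are illustrative, not
the actual implementation:

```python
# Hypothetical simplified registry; the real one maps providers to their models.
_PROVIDER_MODELS = {
    "openai": {"gpt-5.4-mini"},
    "anthropic": {"claude-sonnet-4-6"},
}

def resolve_model_provider(config_provider, model):
    """Return (model_string, provider) for the upstream API call."""
    if "/" in model:
        prefix, bare = model.split("/", 1)
        # Strip the prefix for a direct provider call ONLY when the
        # configured provider exactly matches the model's prefix.
        if prefix == config_provider and prefix in _PROVIDER_MODELS:
            return bare, prefix
        # Cross-provider selection: always route through openrouter
        # with the full provider/model string.
        return model, "openrouter"
    # Bare model name: use the configured provider directly.
    return model, config_provider
```

With this branch order, the cross-provider case (e.g. config_provider='anthropic'
with an openai/ model) can no longer fall into the direct-call path.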
Cases verified:
openrouter + openai/gpt-5.4-mini -> (openai/gpt-5.4-mini, openrouter) ✓
anthropic + openai/gpt-5.4-mini -> (openai/gpt-5.4-mini, openrouter) ✓ FIXED
anthropic + anthropic/claude-... -> (claude-..., anthropic) ✓
anthropic + claude-sonnet-4-6 bare -> (claude-sonnet-4-6, anthropic) ✓
openrouter + anthropic/claude-... -> (anthropic/claude-..., openrouter) ✓
Tests: 426 passed, 0 failed.