fix(review): use _PROVIDER_MODELS check instead of custom-only guard

The original fix preserved full IDs only when config_provider == 'custom',
which broke existing tests expecting prefix-stripping for known namespaces
like 'openai/' and 'google/'.

The correct heuristic: strip the prefix only when it is a known provider
namespace (i.e. prefix in _PROVIDER_MODELS — 'openai', 'google', 'anthropic',
etc.). Unknown prefixes like 'zai-org' are intrinsic to the model ID and must
be preserved. This satisfies both the DeepInfra use case (#548) and the
existing #433 regression tests.
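A minimal sketch of the heuristic (hypothetical stand-ins: the real `_PROVIDER_MODELS` maps many more providers to their model lists, and the real function returns full routing tuples):

```python
# Sketch of the prefix heuristic. _PROVIDER_MODELS here is a hypothetical
# stand-in for the real provider -> models mapping.
_PROVIDER_MODELS = {"openai": [], "google": [], "anthropic": []}

def strip_known_prefix(model_id: str) -> str:
    """Strip 'vendor/' only when vendor is a known provider namespace."""
    if "/" in model_id:
        prefix, bare = model_id.split("/", 1)
        if prefix in _PROVIDER_MODELS:
            return bare  # "openai/gpt-5.4" -> "gpt-5.4"
    return model_id  # "zai-org/GLM-5.1" stays intact (DeepInfra, #548)
```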
Hermes Agent
2026-04-15 22:11:15 +00:00
parent bd55379886
commit dc2334c5a3


@@ -637,14 +637,14 @@ def resolve_model_provider(model_id: str) -> tuple:
     # just because the model name contains a slash (e.g. google/gemma-4-26b-a4b).
     # The user has explicitly pointed at a base_url, so trust their routing config.
     if config_base_url:
-        # For explicit custom endpoints, preserve full slash-bearing model IDs
-        # (e.g. "zai-org/GLM-5.1" on DeepInfra). Stripping the prefix causes
-        # model_not_found on providers that require vendor/model format.
-        if (config_provider or "").strip().lower() == "custom":
-            return model_id, config_provider, config_base_url
-        # Non-custom providers with a base_url override can still use bare IDs.
-        bare_model = model_id.split('/', 1)[-1]
-        return bare_model, config_provider, config_base_url
+        # Only strip the provider prefix when it's a known provider namespace
+        # (e.g. "openai/gpt-5.4" → "gpt-5.4" for a custom OpenAI-compatible proxy).
+        # Unknown prefixes (e.g. "zai-org/GLM-5.1" on DeepInfra) are intrinsic to
+        # the model ID and must be preserved — stripping them causes model_not_found.
+        if prefix in _PROVIDER_MODELS:
+            return bare, config_provider, config_base_url
+        # Unknown prefix (not a named provider) — pass full model_id through.
+        return model_id, config_provider, config_base_url
     # If prefix does NOT match config provider, the user picked a cross-provider model
     # from the OpenRouter dropdown (e.g. config=anthropic but picked openai/gpt-5.4-mini).
     # In this case always route through openrouter with the full provider/model string.
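The cross-provider fallback described in the trailing context lines can be sketched roughly as follows (hypothetical names; the real function also threads through the configured base_url):

```python
# Hypothetical stand-in for the real provider -> models mapping.
_PROVIDER_MODELS = {"openai": [], "google": [], "anthropic": []}

def route(model_id: str, config_provider: str):
    """Sketch: a known prefix that mismatches the configured provider
    routes through openrouter with the full 'provider/model' string."""
    if "/" in model_id:
        prefix = model_id.split("/", 1)[0]
        if prefix in _PROVIDER_MODELS and prefix != config_provider:
            return model_id, "openrouter"  # keep full ID for openrouter
    return model_id, config_provider  # same provider: route directly
```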