Chromie/fix deprecation warning for old model format #1625
base: main
Conversation
Add unit tests to verify:
- UnsupportedModelError includes guidance about provider/model format
- LLMProvider logs deprecation warning for legacy model names
- Legacy model names still work (non-breaking)
- Provider/model format works without deprecation warning
- UnsupportedAISDKModelProviderError for invalid providers

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
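A minimal sketch of what the deprecation-warning test described above could look like, assuming a Vitest-style runner and that `LLMProvider` accepts a logger callback (as the excerpts later in this thread suggest); the import paths and the option object are placeholders, not the package's real API:

```ts
import { describe, expect, it } from "vitest";
// Placeholder import paths; the real ones live in the repository's core package.
import { LLMProvider } from "../lib/llm/LLMProvider";
import type { LogLine } from "../types/log";

describe("legacy model format deprecation", () => {
  it("still returns a client for legacy names but logs a deprecation warning", () => {
    const logs: LogLine[] = [];
    const provider = new LLMProvider((line: LogLine) => logs.push(line));
    const mockClientOptions = { apiKey: "test-key" } as any; // placeholder options

    // Legacy format (no "/") keeps working: this must not throw.
    const client = provider.getClient("gpt-4o", mockClientOptions);
    expect(client).toBeDefined();

    // ...and the logger should have received the deprecation warning.
    expect(
      logs.some((l) => l.message?.includes("Deprecation warning")),
    ).toBe(true);
  });
});
```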
Updates to inform users about the new provider/model format:
1. UnsupportedModelError now includes a note encouraging users to use the provider/model format (e.g., "openai/gpt-4o") instead of the legacy format (e.g., "gpt-4o")
2. Fixed the default case in the LLMProvider switch to throw UnsupportedModelProviderError (for internal consistency issues) instead of UnsupportedModelError (which is for user-facing errors)

This is a non-breaking change - legacy model names still work but users are now informed about the preferred format.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
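To make the two error paths described above concrete, here is a hedged sketch under assumed names; the real error classes, model map, and client construction live in the package, so only the shape of the change is shown:

```ts
// Stand-ins for the real error classes referenced in the commit message.
class UnsupportedModelError extends Error {}
class UnsupportedModelProviderError extends Error {}

// Assumed shape of the legacy model-to-provider map.
const modelToProviderMap: Record<string, "openai" | "anthropic"> = {
  "gpt-4o": "openai",
  "claude-3-5-sonnet-latest": "anthropic",
};

function resolveLegacyProvider(modelName: string): "openai" | "anthropic" {
  const provider = modelToProviderMap[modelName];
  if (!provider) {
    // User-facing error: unknown legacy model name, now with the deprecation note appended.
    throw new UnsupportedModelError(
      `Model "${modelName}" is not supported. Supported models: ${Object.keys(modelToProviderMap).join(", ")}.` +
        `\n\nNote: The legacy model format (e.g., "gpt-4o") is deprecated. ` +
        `Please use the provider/model format instead (e.g., "openai/gpt-4o", "anthropic/claude-3-5-sonnet-latest").`,
    );
  }
  switch (provider) {
    case "openai":
    case "anthropic":
      return provider; // the real code constructs the matching legacy client here
    default:
      // Internal-consistency error: the map produced a provider the switch does not handle.
      throw new UnsupportedModelProviderError(
        `Unhandled provider "${provider}" for model "${modelName}"`,
      );
  }
}
```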
…provider-map Chromie/fix deprecate model to provider map
```ts
this.logger({
  category: "llm",
  message: `Deprecation warning: Model format "${modelName}" is deprecated. Please use the provider/model format (e.g., "openai/gpt-4o" or "anthropic/claude-3-5-sonnet-latest").`,
```
Suggested change:

```diff
- message: `Deprecation warning: Model format "${modelName}" is deprecated. Please use the provider/model format (e.g., "openai/gpt-4o" or "anthropic/claude-3-5-sonnet-latest").`,
+ message: `Deprecation warning: Model format "${modelName}" is deprecated. Please use the provider/model format (e.g., "openai/gpt-5" or "anthropic/claude-sonnet-4-5").`,
```
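For context, a hypothetical sketch of how this warning is presumably gated, using the example model names from the suggestion above; the `LogLine` shape and the `level: 0` field are assumptions taken from the diagrams below, not the package's exact types:

```ts
type LogLine = { category?: string; message: string; level?: 0 | 1 | 2 };

// Only legacy names (no "/") should reach the deprecation warning.
function warnIfLegacyModelFormat(
  modelName: string,
  logger: (line: LogLine) => void,
): void {
  if (modelName.includes("/")) return; // provider/model format: nothing to warn about
  logger({
    category: "llm",
    message:
      `Deprecation warning: Model format "${modelName}" is deprecated. ` +
      `Please use the provider/model format (e.g., "openai/gpt-5" or "anthropic/claude-sonnet-4-5").`,
    level: 0, // the sequence diagram below shows the warning logged at level 0
  });
}
```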
```ts
const modelList = supportedModels.join(", ");
const deprecationNote =
  `\n\nNote: The legacy model format (e.g., "gpt-4o") is deprecated. ` +
  `Please use the provider/model format instead (e.g., "openai/gpt-4o", "anthropic/claude-3-5-sonnet-latest").`;
```
Suggested change:

```diff
- `Please use the provider/model format instead (e.g., "openai/gpt-4o", "anthropic/claude-3-5-sonnet-latest").`;
+ `Please use the provider/model format instead (e.g., "openai/gpt-5", "anthropic/claude-sonnet-4-5").`;
```
Greptile Overview

Greptile Summary

Added deprecation warnings for the legacy model format (e.g., "gpt-4o").

Key changes:
Confidence Score: 5/5
Important Files Changed
Sequence Diagram

```mermaid
sequenceDiagram
    participant User
    participant LLMProvider
    participant Logger
    participant ModelMap
    participant ClientFactory
    User->>LLMProvider: getClient("gpt-4o")
    LLMProvider->>LLMProvider: Check if modelName includes "/"
    alt New format (contains "/")
        LLMProvider->>ClientFactory: Create AISdkClient
        ClientFactory-->>LLMProvider: Return client
    else Legacy format (no "/")
        LLMProvider->>ModelMap: Lookup provider for "gpt-4o"
        ModelMap-->>LLMProvider: Return "openai"
        alt Model not found
            LLMProvider->>User: Throw UnsupportedModelError<br/>(with deprecation note)
        else Model found
            LLMProvider->>Logger: Log deprecation warning (level 0)
            Logger-->>User: Display warning message
            LLMProvider->>ClientFactory: Create legacy client (OpenAIClient)
            ClientFactory-->>LLMProvider: Return client
        end
    end
    LLMProvider-->>User: Return LLMClient
```
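Read alongside the diagram, the two call shapes boil down to roughly the following; this is a hypothetical usage sketch, and the constructor signature and option object are assumptions rather than the package's documented API:

```ts
// Assumes LLMProvider is imported from the core package, as in the test excerpts below.
const provider = new LLMProvider((line) => console.log(line));
const clientOptions = { apiKey: process.env.OPENAI_API_KEY } as any; // placeholder options

// Preferred format: provider is parsed from the "provider/model" prefix; no warning is logged.
const aiSdkClient = provider.getClient("openai/gpt-4o", clientOptions);

// Legacy format: still resolved via the model-to-provider map, but a level-0
// deprecation warning is logged before the legacy OpenAIClient is returned.
const legacyClient = provider.getClient("gpt-4o", clientOptions);
```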
No files reviewed, no comments
Updated model examples in deprecation messaging:
- openai/gpt-4o -> openai/gpt-5
- anthropic/claude-3-5-sonnet-latest -> anthropic/claude-sonnet-4

This addresses the PR review feedback requesting more current model names.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
1 issue found across 3 files
Confidence score: 4/5
- Safe to merge overall; the only noted issue is a low-severity test weakness rather than a runtime bug.
- `packages/core/tests/model-deprecation.test.ts` may silently pass if `getClient` stops throwing, because assertions live only inside a try/catch, reducing confidence in the test's signal.
- Pay close attention to `packages/core/tests/model-deprecation.test.ts` - assertions can be skipped if no exception is thrown.
Prompt for AI agents (all issues)
Check if these issues are valid — if so, understand the root cause of each and fix them.
```
<file name="packages/core/tests/model-deprecation.test.ts">
  <violation number="1" location="packages/core/tests/model-deprecation.test.ts:132">
    P2: Test does not fail if getClient stops throwing; assertions are only inside try/catch, so the error path can be skipped silently.
  </violation>
</file>
```
Architecture diagram
```mermaid
sequenceDiagram
    participant Client as User Code
    participant Provider as LLMProvider
    participant Log as Logger
    Note over Client, Provider: Flow: resolving model names to clients
    Client->>Provider: getClient(modelName, options)
    alt modelName contains "/" (e.g. "openai/gpt-4o")
        Note over Provider: Standard V3 flow (unchanged)
        Provider->>Provider: Instantiate provider from prefix
    else CHANGED: Legacy format (e.g. "gpt-4o")
        Provider->>Provider: Lookup legacy model map
        alt Legacy Mapping Not Found
            Note right of Provider: CHANGED: Error message now<br/>explicitly suggests "provider/model" format
            Provider--xClient: Throw UnsupportedModelError
        else Legacy Mapping Found
            Note right of Log: NEW: Level 0 Deprecation Warning<br/>(Suggests new format)
            Provider->>Log: log("Deprecation warning: ...")
            alt Switch Case: Known Provider (OpenAI/Anthropic)
                Provider->>Provider: Instantiate legacy client
            else NEW: Switch Default (Provider in map but not in switch)
                Provider--xClient: Throw UnsupportedModelProviderError
            end
        end
    end
    Provider-->>Client: Return LLMClient Instance
```
Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review.
```ts
const logger = (line: LogLine) => logs.push(line);
const provider = new LLMProvider(logger);

try {
```
P2: Test does not fail if getClient stops throwing; assertions are only inside try/catch, so the error path can be skipped silently.
Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At packages/core/tests/model-deprecation.test.ts, line 132:
<comment>Test does not fail if getClient stops throwing; assertions are only inside try/catch, so the error path can be skipped silently.</comment>
```
<file context>
@@ -0,0 +1,196 @@
+ const logger = (line: LogLine) => logs.push(line);
+ const provider = new LLMProvider(logger);
+
+ try {
+   provider.getClient("some-unknown-model" as any, mockClientOptions);
+ } catch (error) {
</file context>
```
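One common way to address the issue above, reusing the identifiers from the quoted test and assuming a Vitest/Jest-style matcher API, is to assert that the call throws rather than asserting only inside the catch block:

```ts
// Option A: let the matcher enforce that an error is thrown at all.
expect(() =>
  provider.getClient("some-unknown-model" as any, mockClientOptions),
).toThrow(UnsupportedModelError);

// Option B: keep the try/catch, but fail explicitly when nothing is thrown.
let thrown: unknown;
try {
  provider.getClient("some-unknown-model" as any, mockClientOptions);
} catch (error) {
  thrown = error;
}
expect(thrown).toBeInstanceOf(UnsupportedModelError);
expect((thrown as Error).message).toContain("provider/model format");
```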
why
what changed
test plan
Summary by cubic
Add a deprecation warning for legacy model names (no slash) while keeping them working, and update errors to guide users to the provider/model format. Also improves provider error handling and adds tests.
Written for commit 71ca391. Summary will update on new commits. Review in cubic