Summary
Resolved Codex API 400 errors that prevented classification from working. This issue documents the debugging process and solution for future reference.
Problem
When using the Codex provider for classification, users encountered HTTP 400 errors with no visible error message:
`Streaming request failed (HTTP 400): Request failed, status 400`
Obsidian's `requestUrl` throws exceptions on 4xx errors without exposing the response body, making debugging difficult.
Root Cause Analysis
After comparing our implementation with Smart Composer, we identified three issues:
1. HTTP Client Choice
- Problem: Obsidian's `requestUrl` doesn't properly support streaming and throws on 4xx errors without exposing the response body
- Solution: Use Node's native `https` module directly (like Smart Composer does)
- Reference: Smart Composer's httpTransport.ts
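The key capability gained by switching to Node's `https` module can be sketched as a response handler that collects the body of a failed request before raising. This is illustrative only — the helper name and callback signature are assumptions, not the plugin's actual code:

```javascript
// Sketch (illustrative, not the plugin's actual code): collect the body
// of an https response so that 4xx error messages become visible, which
// is exactly what Obsidian's requestUrl hides when it throws.
function collectResponseBody(res, callback) {
  let body = '';
  res.setEncoding('utf8');
  res.on('data', (chunk) => { body += chunk; });
  res.on('end', () => {
    if (res.statusCode >= 400) {
      // Surface the server's error message instead of a bare status code
      callback(new Error(`HTTP ${res.statusCode}: ${body}`), null);
    } else {
      callback(null, body);
    }
  });
}
```

With this pattern, a 400 response yields the API's actual error payload (e.g. which parameter was rejected) rather than an opaque "status 400".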
2. Header Case Sensitivity
- Problem: We used `ChatGPT-Account-ID` (uppercase 'ID')
- Solution: Changed to `ChatGPT-Account-Id` (lowercase 'd') to match Smart Composer
- Reference: Smart Composer's openaiCodexProvider.ts
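Putting the corrected transport and header together, the request options look roughly like the sketch below. The endpoint path and header names are taken from this issue; the builder function itself is hypothetical:

```javascript
// Hypothetical helper assembling https.request options with the corrected
// header case; endpoint path and headers are as described in this issue.
function buildCodexRequestOptions(token, accountId, payload) {
  return {
    hostname: 'chatgpt.com',
    path: '/backend-api/codex/responses',
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      // Node's https module needs an explicit Content-Length for a string body
      'Content-Length': Buffer.byteLength(payload).toString(),
      Authorization: `Bearer ${token}`,
      'ChatGPT-Account-Id': accountId, // lowercase 'd', matching Smart Composer
    },
  };
}
```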
3. Unsupported Parameter
- Problem: The Codex Responses API does not support the `temperature` parameter
- Solution: Removed `temperature` from the Codex request body
- Reference: Smart Composer's codexMessageAdapter.ts - no `temperature` in `buildRequestBody`
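A minimal request-body builder reflecting this fix might look as follows. The function name mirrors Smart Composer's `buildRequestBody`, but the exact signature here is an assumption:

```javascript
// Sketch of a Codex request-body builder that deliberately omits the
// unsupported `temperature` field; the signature is an assumption.
function buildRequestBody(model, instructions, input) {
  return {
    model,
    instructions,
    input,
    stream: true,
    store: false,
    // No temperature: the Codex Responses API rejects it with HTTP 400
  };
}
```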
4. Model Availability
- Finding: Only `gpt-5.2` works consistently; `gpt-5-mini`, `gpt-4.1-mini`, and `o4-mini` return errors
- Solution: Updated presets to show only `gpt-5.2` as the recommended model
Solution Implementation
Before (Problematic)
```javascript
// Using Obsidian's requestUrl
const response = await requestUrl({
  url,
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${token}`,
    'ChatGPT-Account-ID': accountId, // Wrong case
  },
  body: JSON.stringify({
    model,
    instructions,
    input: [...],
    stream: true,
    store: false,
    temperature: 0.6, // Not supported
  }),
});
```
After (Working)
```javascript
// Using Node's https module like Smart Composer
const https = require('https');
const request = https.request({
  hostname: 'chatgpt.com',
  path: '/backend-api/codex/responses',
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Content-Length': payloadLength.toString(),
    Authorization: `Bearer ${token}`,
    'ChatGPT-Account-Id': accountId, // Correct case
  },
}, callback);

// Request body without temperature
{
  model: 'gpt-5.2',
  instructions,
  input: [...],
  stream: true,
  store: false,
  // No temperature parameter
}
```
Why This Solution Works
- Node's https module bypasses CORS restrictions and provides full access to response bodies (including error messages)
- Correct header case ensures the server properly identifies the account
- Removing temperature avoids "invalid parameter" errors from the Codex API
- Using gpt-5.2 ensures model availability (other models may require higher subscription tiers)
Commits
- fix: remove temperature parameter from Codex API requests
- fix: use Node's https module for Codex API like Smart Composer
- fix: ChatGPT-Account-Id header case (lowercase 'd')
- chore: update Codex presets to only include gpt-5.2
References
- Smart Composer Plugin: https://github.com/glowingjade/obsidian-smart-composer
- Key files referenced:
- src/utils/llm/httpTransport.ts - Node https implementation
- src/core/llm/openaiCodexProvider.ts - Header construction
- src/core/llm/codexMessageAdapter.ts - Request body format
Lessons Learned
- When Obsidian's `requestUrl` throws on errors, consider using Node's native http/https modules for better debugging
- Custom headers like `ChatGPT-Account-Id` may be case-sensitive even though HTTP headers are traditionally case-insensitive
- Always compare implementations with working reference projects when debugging API integration issues