Issue Identified: POST requests to Ollama from the Thunderbird extension's background context return HTTP 403, while GET requests (test connection) succeed.
- Fixed JSON.parse error when response body is empty
- Now gracefully handles non-JSON error responses
- Specific 403 auth error message for better debugging
- Attempts to parse error response body safely regardless of content-type
- Added `authToken` parameter to constructor
- Created `getHeaders()` method that includes an Authorization header if a token is provided
- Both `fetchModels()` and `fetchResponse()` now use the auth-aware headers
- js/workers/ollama-worker.js: Now accepts and passes `ollama_auth_token` to the Ollama class
- api_ollama/ollama-popup.js: Updated to receive the auth token from the background message
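The worker-side plumbing can be pictured roughly like this. This is a sketch, not the extension's exact code: the handler name `forwardAuthToken` and the message shape are assumptions, while the `ollama_auth_token` field name comes from the note above.

```javascript
// Rough sketch of how js/workers/ollama-worker.js might forward the
// token. Only ollama_auth_token is taken from the notes above; the
// handler name and the rest of the message shape are assumptions.
function forwardAuthToken(message) {
  const { host, model, ollama_auth_token } = message;
  // These become the Ollama constructor options, so getHeaders()
  // can attach the Authorization header to every request.
  return { host, model, stream: false, authToken: ollama_auth_token || '' };
}

const opts = forwardAuthToken({
  host: 'http://localhost:11434',
  model: 'tinyllama',
  ollama_auth_token: 'secret',
});
console.log(opts.authToken); // "secret"
```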
- Test Connection: Uses GET `/api/tags` → Returns 200 ✓
- Analysis: Uses POST `/api/chat` → Returns 403 ✗
- Ollama server configured with access restrictions - Some Ollama deployments have security policies that allow reads but restrict writes
- Different network context - background.js may have different network permissions than the popup
- Missing or incorrect auth token - POST requests might require explicit authentication
- CORS/security headers - The extension context might trigger server-side security policies
- OPTIONS preflight handling - The browser might be sending an OPTIONS preflight before the POST
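One way to narrow down the auth-token hypothesis is to build the exact POST request both with and without the token and compare the two: if the tokenless variant matches the curl test below (which succeeds) yet still returns 403 from the extension, the cause is the request context rather than the token. A minimal sketch; `buildChatRequest` is a hypothetical helper, not part of the extension:

```javascript
// Hypothetical helper: builds the POST /api/chat request with or
// without an Authorization header, so the two variants can be diffed.
function buildChatRequest(host, model, authToken) {
  const headers = { 'Content-Type': 'application/json' };
  if (authToken) headers['Authorization'] = `Bearer ${authToken}`;
  return {
    url: `${host}/api/chat`,
    options: {
      method: 'POST',
      headers,
      body: JSON.stringify({
        model,
        messages: [{ role: 'user', content: 'test' }],
        stream: false,
      }),
    },
  };
}

// The only difference between the two variants should be the header.
const withToken = buildChatRequest('http://localhost:11434', 'tinyllama', 'secret');
const withoutToken = buildChatRequest('http://localhost:11434', 'tinyllama', '');
console.log('Authorization' in withToken.options.headers);    // true
console.log('Authorization' in withoutToken.options.headers); // false
```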
```shell
curl -X POST http://localhost:11434/api/chat \
  -H "Content-Type: application/json" \
  -d '{"model":"tinyllama","messages":[{"role":"user","content":"test"}],"stream":false}'
# Result: 200 OK, proper response ✓
```

Go to AutoSort+ Settings → Ollama → Check if "Auth Token" field has a value
- If empty: Try adding a test token or clearing it completely
- Look at browser console for: "Using Ollama at http://localhost:11434..."
```shell
# If Ollama is running in a terminal, check for 403/auth errors
# Or if using a container: docker logs <container-id>
```

Try adding this to the background.js console temporarily:
```javascript
const res = await fetch('http://localhost:11434/api/chat', {
  method: 'POST',
  headers: {'Content-Type': 'application/json'},
  body: JSON.stringify({
    model: 'tinyllama',
    messages: [{role: 'user', content: 'test'}],
    stream: false
  })
});
console.log('Direct fetch status:', res.status);
const data = await res.json();
console.log('Response:', data);
```

- background.js - Better error handling, removed broken tab proxy
- js/ollama.js - Added auth token support
- js/workers/ollama-worker.js - Passes auth token to class
- api_ollama/ollama-popup.js - Receives auth token from message
- manifest.json - Added web_accessible_resources
Email Analysis Request
↓
background.js analyzeEmailContent()
↓
Direct fetch() to http://localhost:11434/api/chat
↓ (includes Authorization header if token is set)
Ollama Server (local)
↓
Response with label
Now tries multiple approaches to parse the error response:
- Checks the content-type header to see if the body is JSON
- Falls back to text() for error pages
- Gracefully handles parse errors
- Specific message for 403: "Ollama authentication failed (403). Check your API key/token if Ollama requires authentication."
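That parsing strategy can be sketched as follows. This is a hypothetical `parseErrorBody` helper illustrating the approach, not the extension's exact code:

```javascript
// Sketch of the "parse the error body safely" strategy: prefer JSON
// when the content-type says so, fall back to plain text, and never
// let a parse failure mask the original HTTP error.
async function parseErrorBody(response) {
  const contentType = response.headers.get('content-type') || '';
  try {
    if (contentType.includes('application/json')) {
      const data = await response.json();
      return data.error || JSON.stringify(data);
    }
    // HTML error pages, empty bodies, etc. fall back to text.
    const text = await response.text();
    return text || `HTTP ${response.status}`;
  } catch (e) {
    // Parse failure: report the status rather than throwing.
    return `HTTP ${response.status} (unparseable body)`;
  }
}

// Usage with a synthetic Response (Node 18+ / browsers):
const jsonErr = new Response('{"error":"invalid token"}', {
  status: 403,
  headers: { 'content-type': 'application/json' },
});
parseErrorBody(jsonErr).then(msg => console.log(msg)); // "invalid token"
```

An empty 403 body falls through the text() branch and still yields a usable "HTTP 403" message instead of a JSON.parse exception.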
```javascript
constructor({host = '', model = '', stream = false, num_ctx = 0, authToken = ''}) {
  // ...other assignments elided in this excerpt...
  this.authToken = authToken || '';
}

getHeaders = () => {
  const headers = {"Content-Type": "application/json"};
  if (this.authToken) {
    headers['Authorization'] = `Bearer ${this.authToken}`;
  }
  return headers;
}
```

- Ensure Ollama server is running: `curl http://localhost:11434/api/tags`
- Check Settings → Ollama → "Test Connection" (should work)
- Try analyzing an email and check console for detailed error
- Check if Auth Token needs to be set/cleared
- Review Ollama server logs for 403 details
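To verify the token actually reaches the Authorization header, the auth-aware pieces can be exercised offline. This is a minimal stand-in mirroring the js/ollama.js excerpt above, not the full class:

```javascript
// Minimal stand-in for js/ollama.js: just enough to confirm the
// Authorization header appears only when a token is set.
class Ollama {
  constructor({ host = '', model = '', authToken = '' } = {}) {
    this.host = host;
    this.model = model;
    this.authToken = authToken || '';
  }

  getHeaders = () => {
    const headers = { 'Content-Type': 'application/json' };
    if (this.authToken) {
      headers['Authorization'] = `Bearer ${this.authToken}`;
    }
    return headers;
  };
}

console.log(new Ollama({ authToken: 'abc' }).getHeaders());
// Includes Authorization: 'Bearer abc'
console.log(new Ollama({}).getHeaders());
// Content-Type only, no Authorization key
```

If the first call is missing the Authorization key in the real extension, the token is being lost somewhere between settings storage and the constructor.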
- ✓ manifest.json - Updated with web_accessible_resources
- ✓ background.js - Error handling + fetch calls
- ✓ js/ollama.js - Auth token support
- ✓ js/workers/ollama-worker.js - Auth token forwarding
- ✓ api_ollama/index.html - Popup UI
- ✓ api_ollama/ollama-popup.js - Popup handler with auth
Last Updated: 2026-01-16 00:44
XPI: autosortplus.xpi (58K)