feat: expose LLM error reason in /api/ask response and UI

- Add llm_error field to AskResponse so users know why AI summarisation was skipped
- Show orange warning banner in frontend when LLM is not configured or call fails
- Update AskEndpoint tests to assert llm_error presence
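The shape of the `llm_error` change described above can be sketched as follows. This is a minimal illustration, not the project's actual code: the field names other than `llm_error` and the use of a plain stdlib dataclass (rather than whatever model framework the backend actually uses, e.g. pydantic) are assumptions.

```python
# Hypothetical sketch of the AskResponse change from the commit message:
# an optional llm_error field explaining why AI summarisation was skipped.
from dataclasses import dataclass, field, asdict
from typing import Optional


@dataclass
class AskResponse:
    answer: str = ""                              # hypothetical existing field
    events: list = field(default_factory=list)    # hypothetical existing field
    llm_error: Optional[str] = None               # new: set when the LLM call is skipped or fails


# When the LLM is not configured, the endpoint would populate llm_error
# instead of silently omitting the summary; the frontend can then render
# the .ask-error warning banner from this value.
resp = AskResponse(llm_error="LLM not configured")
print(asdict(resp)["llm_error"])  # -> LLM not configured
```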
2026-04-20 15:45:32 +02:00
parent be319688f6
commit 9ec193ea13
4 changed files with 22 additions and 1 deletion


@@ -418,6 +418,16 @@ input {
  font-size: 13px;
}
.ask-error {
  background: rgba(249, 115, 22, 0.1);
  border: 1px solid rgba(249, 115, 22, 0.3);
  border-radius: 8px;
  padding: 10px 14px;
  color: #fdba74;
  font-size: 14px;
  margin-bottom: 10px;
}
.ask-events {
  margin-bottom: 14px;
}