feat: expose LLM error reason in /api/ask response and UI

- Add llm_error field to AskResponse so users know why AI summarisation was skipped
- Show orange warning banner in frontend when LLM is not configured or call fails
- Update AskEndpoint tests to assert llm_error presence
2026-04-20 15:45:32 +02:00
parent be319688f6
commit 9ec193ea13
4 changed files with 22 additions and 1 deletions


@@ -92,3 +92,4 @@ class AskResponse(BaseModel):
     events: list[AskEventRef]
     query_info: dict
     llm_used: bool
+    llm_error: str | None = None