feat: expose LLM error reason in /api/ask response and UI
- Add llm_error field to AskResponse so users know why AI summarisation was skipped
- Show orange warning banner in frontend when LLM is not configured or call fails (a handler sketch follows below)
- Update AskEndpoint tests to assert llm_error presence (a test sketch follows after the diff)
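A minimal sketch of how the endpoint might populate the new field, assuming a helper-based flow; llm_configured() and summarise() are hypothetical stand-ins, not names from this commit:

# Hedged sketch; llm_configured() and summarise() are hypothetical
# helpers, not names from this commit.
from pydantic import BaseModel


class AskResponse(BaseModel):
    events: list[dict]      # simplified stand-in for list[AskEventRef]
    query_info: dict
    llm_used: bool
    llm_error: str | None = None


def llm_configured() -> bool:
    # Hypothetical config check, e.g. "is an API key set?"
    return False


async def summarise(query: str) -> str:
    # Hypothetical LLM call; may raise on provider/network errors.
    raise RuntimeError("provider timeout")


async def ask(query: str) -> AskResponse:
    llm_used, llm_error = False, None
    if not llm_configured():
        llm_error = "LLM not configured"
    else:
        try:
            await summarise(query)
            llm_used = True
        except Exception as exc:
            # Surface the reason instead of silently skipping summarisation.
            llm_error = f"LLM call failed: {exc}"
    return AskResponse(
        events=[],
        query_info={"query": query},
        llm_used=llm_used,
        llm_error=llm_error,
    )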
@@ -92,3 +92,4 @@ class AskResponse(BaseModel):
     events: list[AskEventRef]
     query_info: dict
     llm_used: bool
+    llm_error: str | None = None
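The updated tests could assert the field along these lines; the `client` fixture (e.g. FastAPI's TestClient) and the request payload shape are assumptions based on the commit message:

# Hedged test sketch; the `client` fixture and payload shape are assumed.
def test_ask_exposes_llm_error(client):
    resp = client.post("/api/ask", json={"query": "what happened today?"})
    assert resp.status_code == 200
    body = resp.json()
    # The field should be present even when null (FastAPI keeps None-valued
    # fields unless response_model_exclude_none is set).
    assert "llm_error" in body
    if not body["llm_used"]:
        assert body["llm_error"] is not None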