feat: expose LLM error reason in /api/ask response and UI
- Add llm_error field to AskResponse so users know why AI summarisation was skipped
- Show orange warning banner in frontend when LLM is not configured or call fails
- Update AskEndpoint tests to assert llm_error presence
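For context, a minimal sketch of the response shape this commit describes, assuming a TypeScript client; only AskResponse and llm_error come from the commit message, and the answer field is a hypothetical placeholder for the existing payload:

// Hypothetical client-side type for /api/ask responses.
interface AskResponse {
  // Assumed placeholder for whatever /api/ask already returns.
  answer?: string;
  // New in this commit: set when AI summarisation was skipped or the
  // LLM call failed, e.g. because no LLM is configured.
  llm_error?: string;
}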
@@ -418,6 +418,16 @@ input {
   font-size: 13px;
 }
 
+.ask-error {
+  background: rgba(249, 115, 22, 0.1);
+  border: 1px solid rgba(249, 115, 22, 0.3);
+  border-radius: 8px;
+  padding: 10px 14px;
+  color: #fdba74;
+  font-size: 14px;
+  margin-bottom: 10px;
+}
+
 .ask-events {
   margin-bottom: 14px;
 }
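A hedged sketch of how the frontend might apply the new .ask-error class when llm_error comes back; renderAskError and the container element are assumptions for illustration, not the commit's actual code:

// Hypothetical rendering logic for the orange warning banner.
function renderAskError(res: AskResponse, container: HTMLElement): void {
  // Drop any stale banner left over from a previous request.
  container.querySelector(".ask-error")?.remove();
  if (!res.llm_error) return; // no error, no banner
  const banner = document.createElement("div");
  banner.className = "ask-error"; // picks up the styles added above
  banner.textContent = `AI summarisation skipped: ${res.llm_error}`;
  // The banner's margin-bottom suggests it sits above the .ask-events list.
  container.prepend(banner);
}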