feat: raise LLM event limit to 200 and show total count awareness
- Bump LLM_MAX_EVENTS default from 50 to 200
- Add total_matched count to /api/ask response
- Include a "Showing X of Y total" header in the LLM prompt so the model knows when its view is a subset and avoids false certainty
- Update the system prompt to instruct the model to acknowledge scale when its view is truncated
- Update test mocks to accept the new total parameter
@@ -42,6 +42,6 @@ ALERTS_ENABLED=false
 LLM_API_KEY=
 LLM_BASE_URL=https://api.openai.com/v1
 LLM_MODEL=gpt-4o-mini
-LLM_MAX_EVENTS=50
+LLM_MAX_EVENTS=200
 LLM_TIMEOUT_SECONDS=30
 LLM_API_VERSION=
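The "Showing X of Y total" header described in the commit message might be assembled along these lines. This is a minimal sketch, not the repository's actual code: the function name `build_event_context`, the event formatting, and the exact header wording are assumptions.

```python
def build_event_context(events, total_matched, max_events=200):
    """Build the event portion of the LLM prompt.

    Hypothetical sketch: when `total_matched` exceeds the number of
    events actually included, a header tells the model its view is a
    truncated subset so it can acknowledge scale instead of asserting
    false certainty.
    """
    shown = min(len(events), max_events)
    lines = []
    if total_matched > shown:
        lines.append(
            f"Showing {shown} of {total_matched} total matching events. "
            "This is a subset; do not assume these are all the events."
        )
    else:
        lines.append(f"Showing all {shown} matching events.")
    # Append one line per event (real code would format fields here).
    lines.extend(str(event) for event in events[:max_events])
    return "\n".join(lines)
```

The same `total_matched` value would also be returned in the `/api/ask` response body, which is why the commit updates the test mocks to accept the new parameter.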