Release v1.3.0: AI feature flag and MCP server
- Add AI_FEATURES_ENABLED config flag to gate AI/natural-language features
- Conditionally register /api/ask router based on AI_FEATURES_ENABLED
- Add GET /api/config/features endpoint for frontend feature detection
- Update frontend to hide Ask panel when AI features are disabled
- Implement standalone MCP server (backend/mcp_server.py) with tools:
  * search_events, get_event, get_summary, ask
- Add mcp dependency to requirements.txt
- Update .env.example, AGENTS.md, and ROADMAP.md
- Bump VERSION to 1.3.0
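The gating pattern described above can be sketched in plain Python. This is a hypothetical illustration, not the project's actual code: the route table and `get_features` handler are stand-ins for the real FastAPI app, but the names (`AI_FEATURES_ENABLED`, `/api/ask`, `/api/config/features`) come from the commit message.

```python
class Settings:
    # Mirrors the new flag added to the Settings class in this commit.
    AI_FEATURES_ENABLED: bool = True

settings = Settings()

# Stand-in for the app's route table; the features endpoint is always present.
routes: dict = {"/api/config/features": "features"}

# Conditional registration: the /api/ask router only exists when AI is enabled.
if settings.AI_FEATURES_ENABLED:
    routes["/api/ask"] = "ask"

def get_features() -> dict:
    """Payload the frontend can fetch to decide whether to show the Ask panel."""
    return {"ai_features_enabled": settings.AI_FEATURES_ENABLED}
```

With the flag set to False, `/api/ask` is never registered, so disabled deployments return 404 for AI routes rather than an error from a half-configured feature.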
@@ -42,7 +42,8 @@ class Settings(BaseSettings):
     # Alerting
     ALERTS_ENABLED: bool = False

-    # LLM / Natural Language Query
+    # AI / Natural Language Query
+    AI_FEATURES_ENABLED: bool = True
     LLM_API_KEY: str = ""
     LLM_BASE_URL: str = "https://api.openai.com/v1"
     LLM_MODEL: str = "gpt-4o-mini"
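Because the flag lives on a pydantic `BaseSettings` class, its default of `True` can be overridden by an environment variable (e.g. via the updated `.env.example`). A minimal stdlib-only sketch of the boolean parsing such settings loaders typically do — the accepted truthy strings here are an assumption, not pydantic's exact rules:

```python
import os

def env_bool(name: str, default: bool) -> bool:
    """Read an env var and coerce common truthy spellings to a bool."""
    raw = os.environ.get(name)
    if raw is None:
        return default  # unset: fall back to the code default
    return raw.strip().lower() in ("1", "true", "yes", "on")
```

So setting `AI_FEATURES_ENABLED=false` in the environment disables the AI features without a code change.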
@@ -77,6 +78,7 @@ SIEM_ENABLED = _settings.SIEM_ENABLED
 SIEM_WEBHOOK_URL = _settings.SIEM_WEBHOOK_URL
 ALERTS_ENABLED = _settings.ALERTS_ENABLED

+AI_FEATURES_ENABLED = _settings.AI_FEATURES_ENABLED
 LLM_API_KEY = _settings.LLM_API_KEY
 LLM_BASE_URL = _settings.LLM_BASE_URL
 LLM_MODEL = _settings.LLM_MODEL
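The MCP server's tool surface (search_events, get_event, get_summary, ask) can be sketched as a decorator-based registry. This is a hypothetical stand-in: the real backend/mcp_server.py uses the `mcp` SDK from requirements.txt, and the sample events and handler bodies below are invented for illustration.

```python
from typing import Callable, Optional

# Registry mapping tool names to handlers, as an MCP server would expose them.
TOOLS: dict = {}

def tool(fn: Callable) -> Callable:
    """Register fn under its function name, similar to an MCP tool decorator."""
    TOOLS[fn.__name__] = fn
    return fn

# Invented sample data standing in for the app's event store.
EVENTS = [
    {"id": 1, "msg": "login failed"},
    {"id": 2, "msg": "login ok"},
]

@tool
def search_events(query: str) -> list:
    """Return events whose message contains the query string."""
    return [e for e in EVENTS if query in e["msg"]]

@tool
def get_event(event_id: int) -> Optional[dict]:
    """Look up a single event by id, or None if absent."""
    return next((e for e in EVENTS if e["id"] == event_id), None)
```

Keeping the tools as plain functions behind a registry makes them easy to unit-test independently of the MCP transport.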