Released 2026-04-20 16:11:31 +00:00 | 48 commits to main since this release

## What's New
### AI Feature Flag (`AI_FEATURES_ENABLED`)

- Gate all AI/natural-language features with a single environment variable.
- When `false`, the `/api/ask` endpoint is completely unregistered and the "Ask a question" panel is hidden from the UI.
- New `GET /api/config/features` endpoint exposes feature flags to the frontend.
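A minimal sketch of how such a flag can gate route registration. The parsing rule (anything but the literal `false` counts as enabled) and the `register_routes` helper are assumptions for illustration, not the project's actual code:

```python
import os

def ai_features_enabled() -> bool:
    # Assumed parsing rule: only the literal string "false" disables AI features.
    return os.getenv("AI_FEATURES_ENABLED", "true").strip().lower() != "false"

def register_routes() -> dict:
    """Hypothetical route table mirroring the described behaviour:
    /api/config/features is always present; /api/ask exists only when enabled."""
    routes = {"/api/config/features": {"ai_features_enabled": ai_features_enabled()}}
    if ai_features_enabled():
        routes["/api/ask"] = "ask_handler"
    return routes
```

Because the route is never registered when the flag is off, a disabled `/api/ask` returns 404 rather than 403, leaving no trace of the feature.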
### MCP Server (`backend/mcp_server.py`)

- Standalone Model Context Protocol server for Claude Desktop, Cursor, and other MCP clients.
- Connects directly to MongoDB (bypasses the FastAPI auth layer — run only in trusted environments).
- Exposes four tools:
  - `search_events` — filter by entity, service, operation, result, and time range.
  - `get_event` — retrieve raw event JSON by ID.
  - `get_summary` — aggregated activity summary (by service, operation, result, top actors) for the last N days.
  - `ask` — natural-language query that returns recent matching events.
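The filtering contract of `search_events` can be sketched in plain Python. The event field names (`entity`, `service`, `timestamp`, ...) are assumptions inferred from the tool list above, not the server's actual schema:

```python
from datetime import datetime, timezone

def search_events(events, *, entity=None, service=None, operation=None,
                  result=None, since=None, until=None):
    """Sketch of search_events filtering: every supplied criterion must
    match, and the time range is inclusive on both ends."""
    def matches(e):
        for key, want in (("entity", entity), ("service", service),
                          ("operation", operation), ("result", result)):
            if want is not None and e.get(key) != want:
                return False
        ts = e.get("timestamp")
        if since is not None and ts < since:
            return False
        if until is not None and ts > until:
            return False
        return True
    return [e for e in events if matches(e)]
```

In the real server the same criteria would be translated into a MongoDB query rather than filtered in memory.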
### Natural Language Query (`/api/ask`)

- Ask questions like "What happened to device ABC123 in the last 3 days?"
- Intent-aware service filtering: broad queries automatically exclude high-volume Exchange/SharePoint noise.
- Smart sampling: when datasets are large, failures and high-value services are prioritized for LLM context.
- Aggregated overviews for datasets with >50 events.
- Respects active UI filters (services, actor, operation, result, tags).
- Azure OpenAI / MS Foundry compatible (`api-key` header, `api-version`, `max_completion_tokens`).
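The smart-sampling idea above can be sketched as a simple priority ordering. The service names, the default limit, and treating any non-`Success` result as a failure are all assumptions for illustration:

```python
def sample_for_llm(events, limit=50, high_value=("Entra", "Intune")):
    """Sketch of prioritised sampling: when the dataset exceeds the limit,
    keep failures first, then events from high-value services, then the rest."""
    if len(events) <= limit:
        return list(events)
    failures = [e for e in events if e.get("result") != "Success"]
    successes = [e for e in events if e.get("result") == "Success"]
    valued = [e for e in successes if e.get("service") in high_value]
    other = [e for e in successes if e.get("service") not in high_value]
    return (failures + valued + other)[:limit]
```

Ordering before truncating guarantees that rare-but-important events (failures) survive even when bulk Exchange/SharePoint traffic dominates the dataset.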
### Version Endpoint

- `GET /api/version` returns the running version (baked into the Docker image at build time).
- Displayed as a badge in the UI header.
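One common way to bake a version in is to pass a Docker build `ARG` through to an environment variable the handler reads. The variable name `APP_VERSION` and the `dev` fallback are assumptions, not the project's actual mechanism:

```python
import os

def get_version() -> dict:
    # APP_VERSION is assumed to be set at image build time
    # (e.g. ARG VERSION -> ENV APP_VERSION=$VERSION in the Dockerfile).
    return {"version": os.getenv("APP_VERSION", "dev")}
```

The handler stays trivial because the value is fixed per image; no file reads or git calls at request time.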
### Upgrade Notes

- Pull the new image: `docker compose pull && docker compose up -d`
- Optionally set `AI_FEATURES_ENABLED=false` in `.env` to disable AI features.
- Optionally configure `LLM_API_KEY`, `LLM_BASE_URL`, `LLM_MODEL` for the `/api/ask` endpoint.
- For MCP: install `mcp` from `requirements.txt` and configure your MCP client to run `python backend/mcp_server.py`.
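For Claude Desktop, the client configuration would be an entry along these lines in `claude_desktop_config.json`. The server name `audit-log` is a placeholder, and the script path assumes the client is launched from the repository root (use an absolute path otherwise):

```json
{
  "mcpServers": {
    "audit-log": {
      "command": "python",
      "args": ["backend/mcp_server.py"]
    }
  }
}
```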