    v1.3.0 — AI Feature Flag & MCP Server (60b6ad15c4)

    tomas.kracmar released this 2026-04-20 16:11:31 +00:00 | 48 commits to main since this release

    What's New

    AI Feature Flag (AI_FEATURES_ENABLED)

    • Gate all AI/natural-language features with a single environment variable.
    • When false, the /api/ask endpoint is not registered at all and the "Ask a question" panel is hidden from the UI.
    • New GET /api/config/features endpoint exposes feature flags to the frontend.
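    As a rough sketch of how this kind of gate typically works (the `env_flag` helper, the enabled-by-default behavior, and the payload field name are assumptions, not the project's actual code):

```python
import os

def env_flag(name: str, default: bool = True) -> bool:
    """Parse a boolean environment variable ("1"/"true"/"yes"/"on" => True)."""
    raw = os.getenv(name)
    if raw is None:
        return default
    return raw.strip().lower() in {"1", "true", "yes", "on"}

AI_FEATURES_ENABLED = env_flag("AI_FEATURES_ENABLED")

def feature_config() -> dict:
    # Shape of the payload the frontend could read from GET /api/config/features
    # (the field name here is an assumption).
    return {"ai_features_enabled": AI_FEATURES_ENABLED}

# In the FastAPI app, the /api/ask router would then only be mounted when the
# flag is on, e.g.:
#   if AI_FEATURES_ENABLED:
#       app.include_router(ask_router)
```

    Keeping the parsing in a single helper means every consumer of the flag agrees on what counts as "false".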

    MCP Server (backend/mcp_server.py)

    • Standalone Model Context Protocol server for Claude Desktop, Cursor, and other MCP clients.
    • Connects directly to MongoDB (bypasses FastAPI auth layer — run only in trusted environments).
    • Exposes four tools:
      • search_events — filter by entity, service, operation, result, and time range.
      • get_event — retrieve raw event JSON by ID.
      • get_summary — aggregated activity summary (by service, operation, result, top actors) for the last N days.
      • ask — natural language query that returns recent matching events.
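    For illustration, the filters accepted by search_events map naturally onto a MongoDB query document. A hedged sketch (all field names are assumptions about the event schema, not the server's actual code):

```python
from datetime import datetime, timedelta, timezone

def build_event_filter(entity=None, service=None, operation=None,
                       result=None, days=None):
    """Translate search_events-style arguments into a MongoDB filter dict."""
    query = {}
    if entity:
        query["entity"] = entity
    if service:
        query["service"] = service
    if operation:
        query["operation"] = operation
    if result:
        query["result"] = result
    if days:
        # Restrict to the trailing time window.
        since = datetime.now(timezone.utc) - timedelta(days=days)
        query["timestamp"] = {"$gte": since}
    return query

# The MCP tool could then pass the dict straight to pymongo, e.g.:
#   db.events.find(build_event_filter(entity="ABC123", days=3))
```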

    Natural Language Query (/api/ask)

    • Ask questions like "What happened to device ABC123 in the last 3 days?"
    • Intent-aware service filtering: broad queries automatically exclude high-volume Exchange/SharePoint noise.
    • Smart sampling: when datasets are large, failures and high-value services are prioritized for LLM context.
    • Aggregated overviews for datasets with >50 events.
    • Respects active UI filters (services, actor, operation, result, tags).
    • Azure OpenAI / MS Foundry compatible (api-key header, api-version, max_completion_tokens).
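    The smart-sampling rule above can be sketched as a three-bucket priority order (the limit of 50, the "Failed" result value, and the high-value service name are illustrative assumptions):

```python
def sample_for_llm(events, limit=50, high_value=("AzureAD",)):
    """Pick up to `limit` events: failures first, then high-value services,
    then everything else in original order."""
    if len(events) <= limit:
        return list(events)
    failures, valued, rest = [], [], []
    for e in events:
        if e.get("result") == "Failed":
            failures.append(e)
        elif e.get("service") in high_value:
            valued.append(e)
        else:
            rest.append(e)
    return (failures + valued + rest)[:limit]
```

    In this sketch, failures are kept ahead of everything else, so error context is the last thing dropped when the dataset is large.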

    Version Endpoint

    • GET /api/version returns the running version (baked into the Docker image at build time).
    • Displayed as a badge in the UI header.
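    An endpoint like this usually just echoes a value baked into the image at build time; a minimal sketch (the APP_VERSION variable name and the "dev" fallback are assumptions):

```python
import os

# In the Dockerfile, the version would typically be injected at build time:
#   ARG APP_VERSION
#   ENV APP_VERSION=${APP_VERSION}
APP_VERSION = os.getenv("APP_VERSION", "dev")

def version_payload() -> dict:
    """Payload returned by GET /api/version."""
    return {"version": APP_VERSION}
```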

    Upgrade Notes

    1. Pull the new image: docker compose pull && docker compose up -d
    2. Optionally set AI_FEATURES_ENABLED=false in .env to disable AI features.
    3. Optionally configure LLM_API_KEY, LLM_BASE_URL, LLM_MODEL for the /api/ask endpoint.
    4. For MCP: install mcp from requirements.txt and configure your MCP client to run python backend/mcp_server.py.
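    For step 4, an entry along these lines would typically go into the MCP client's configuration (e.g. Claude Desktop's claude_desktop_config.json). The server name "audit-log" and the MONGO_URI variable are illustrative assumptions, not values from this release:

```json
{
  "mcpServers": {
    "audit-log": {
      "command": "python",
      "args": ["backend/mcp_server.py"],
      "env": {
        "MONGO_URI": "mongodb://localhost:27017"
      }
    }
  }
}
```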