AOC v1.1.0 Release Notes
Release date: 2026-04-20
What's new
Natural language query (/api/ask)
Ask questions in plain English and get AI-generated answers backed by your audit logs.
- Regex-based parsing extracts time ranges (`last 3 days`, `yesterday`, `today`) and entities (`device ABC123`, `user bob@example.com`) without calling an LLM — fast and deterministic.
- AI narrative summarisation via any OpenAI-compatible API (OpenAI, Azure OpenAI, MS Foundry, Ollama). The LLM reads the matching events and writes a concise story for non-expert admins.
- Graceful fallback when no LLM is configured — returns a structured bullet list instead of a narrative.
- Cited evidence — every answer includes the raw events that back it up, so admins can verify claims.
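The deterministic parsing step described above can be sketched roughly as follows. This is an illustrative example only — the function name, regex patterns, and filter keys are assumptions, not AOC's actual code:

```python
import re
from datetime import datetime, timedelta, timezone

# Illustrative patterns: map a time phrase to a lookback window.
# (Real code would likely snap "yesterday"/"today" to day boundaries.)
TIME_PATTERNS = {
    r"\blast (\d+) days?\b": lambda m: timedelta(days=int(m.group(1))),
    r"\byesterday\b": lambda m: timedelta(days=1),
    r"\btoday\b": lambda m: timedelta(days=0),
}
DEVICE_RE = re.compile(r"\bdevice (\w+)", re.IGNORECASE)
USER_RE = re.compile(r"\buser (\S+@\S+)", re.IGNORECASE)

def parse_question(question: str) -> dict:
    """Extract a time window and entity filters from a plain-English question."""
    filters = {}
    for pattern, to_delta in TIME_PATTERNS.items():
        m = re.search(pattern, question, re.IGNORECASE)
        if m:
            filters["since"] = datetime.now(timezone.utc) - to_delta(m)
            break
    if m := DEVICE_RE.search(question):
        filters["device"] = m.group(1)
    if m := USER_RE.search(question):
        filters["user"] = m.group(1)
    return filters
```

Because no LLM is involved at this stage, the same question always yields the same filters, and the extracted filters can then drive a normal audit-log query.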
Azure OpenAI / MS Foundry support
- Automatic `api-key` header detection for Azure endpoints.
- `LLM_API_VERSION` config for Azure `api-version` query parameters.
- `max_completion_tokens` support for newer model deployments.
Production hardening
- Dockerfile: runs as non-root user, uses Gunicorn + Uvicorn workers.
- docker-compose.prod.yml: MongoDB is internal-only (no host port exposure), health checks on all services, nginx reverse proxy with security headers.
- nginx config: gzip, security headers (`X-Frame-Options`, `X-Content-Type-Options`), ready for TLS.
Frontend
- New "Ask a question" panel at the top of the page.
- Markdown rendering for LLM answers (bold, italic, code).
- Orange warning banner when LLM is not configured or fails.
Tests
- 29 new tests covering ask parsing, query building, and endpoint behaviour.
- 62 tests total, all passing.
Configuration
Add to your .env:
# Required for AI narrative summarisation
LLM_API_KEY=your-key
LLM_BASE_URL=https://api.openai.com/v1
LLM_MODEL=gpt-4o-mini
LLM_MAX_EVENTS=50
LLM_TIMEOUT_SECONDS=30
LLM_API_VERSION= # set for Azure OpenAI, e.g. 2024-12-01-preview
Upgrade notes
No breaking changes. Existing /api/events, filters, pagination, tags, and comments work unchanged.
Docker image
git.cqre.net/cqrenet/aoc-backend:v1.1.0