# AOC v1.2.5 Release Notes

Release date: 2026-04-20


## What's new

### Natural language query (`/api/ask`)

Ask questions in plain English and get AI-generated answers backed by your audit logs.

- Regex-based parsing extracts time ranges ("last 3 days", "yesterday", "today") and entities ("device ABC123", "user bob@example.com") without calling an LLM.
- AI narrative summarisation via any OpenAI-compatible API (OpenAI, Azure OpenAI, MS Foundry, Ollama).
- Graceful fallback when no LLM is configured — returns a structured bullet list with a clear error banner.
- Cited evidence — every answer includes the raw events that back it up.

### Filter-aware queries

The ask endpoint now respects the filter panel. When you set *Service = Exchange* and *Result = failure*, then ask "What happened to device X?", the LLM sees only failed Exchange events for that device.
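Conceptually this is a merge of the panel filters with the entities parsed from the question. A minimal sketch, assuming panel filters take precedence (the helper name and precedence rule are assumptions):

```python
def build_event_query(panel_filters: dict, parsed: dict) -> dict:
    """Combine the UI filter panel with entities parsed from the
    question. Panel filters win on conflict (assumed behaviour)."""
    return {**parsed, **panel_filters}

# Panel: Service = Exchange, Result = failure; question mentions device X
query = build_event_query(
    {"service": "Exchange", "result": "failure"},
    {"device": "X"},
)
```

The resulting query restricts the event lookup, so the LLM's context contains only the failed Exchange events for that device.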

### Scales to thousands of events

For large result sets (>50 events), the LLM receives an aggregated overview instead of a raw dump:

- Counts by service, action, result, and actor
- Failure highlights
- The 50 most recent raw events as samples

This keeps token usage low while preserving accuracy.
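The aggregation step above could look roughly like this sketch (the `summarise_events` helper, the threshold constant, and the field names are illustrative assumptions):

```python
from collections import Counter

MAX_RAW_EVENTS = 50  # raw-sample cap from the release notes

def summarise_events(events: list[dict]) -> dict:
    """For large result sets, hand the LLM an aggregated overview plus
    the most recent raw events instead of a full dump."""
    if len(events) <= MAX_RAW_EVENTS:
        return {"events": events}  # small enough to send verbatim

    # Counts by service, action, result, and actor
    counts = {
        field: dict(Counter(e.get(field, "unknown") for e in events))
        for field in ("service", "action", "result", "actor")
    }
    # Failure highlights
    failures = [e for e in events if e.get("result") == "failure"]
    # The 50 most recent raw events as samples
    recent = sorted(events, key=lambda e: e["timestamp"], reverse=True)
    return {
        "counts": counts,
        "failure_count": len(failures),
        "samples": recent[:MAX_RAW_EVENTS],
    }
```

Sending counts plus a bounded sample keeps the prompt size roughly constant regardless of how many events matched.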

### Azure OpenAI / MS Foundry support

- Automatic `api-key` header detection for Azure endpoints.
- `LLM_API_VERSION` config for the Azure `api-version` query parameter.
- `max_completion_tokens` support for newer model deployments.
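The header and query-parameter selection might work along these lines (the detection heuristic and function name are assumptions; Azure uses an `api-key` header, while plain OpenAI-compatible endpoints use a Bearer token):

```python
def llm_request_params(base_url: str, api_key: str,
                       api_version: str = "") -> tuple[dict, dict]:
    """Pick auth headers and query params for the target endpoint.
    Detection heuristic is illustrative, not the actual implementation."""
    if ".azure.com" in base_url or api_version:
        # Azure OpenAI / MS Foundry style
        headers = {"api-key": api_key}
        params = {"api-version": api_version} if api_version else {}
    else:
        # Plain OpenAI-compatible style (OpenAI, Ollama, ...)
        headers = {"Authorization": f"Bearer {api_key}"}
        params = {}
    return headers, params
```

With `LLM_API_VERSION` set, requests automatically carry the `api-version` query parameter Azure expects.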

### Version display

- `GET /api/version` endpoint reads the `VERSION` file.
- The frontend shows a version badge in the header (e.g., 1.2.5).
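The endpoint's backing logic is presumably a one-liner along these lines (helper name and file location are assumptions):

```python
from pathlib import Path

def read_version(version_file: Path) -> str:
    """Backing logic for GET /api/version: return the contents of the
    VERSION file with surrounding whitespace stripped."""
    return version_file.read_text(encoding="utf-8").strip()
```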

### Production hardening (from v1.1.0)

- The Dockerfile runs as a non-root user with Gunicorn + Uvicorn workers.
- `docker-compose.prod.yml` with internal-only MongoDB, health checks, and an nginx reverse proxy.
- Security headers (`X-Frame-Options`, `X-Content-Type-Options`, etc.).
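As a generic illustration (not AOC's actual code — the headers may equally be set in nginx), such headers can be appended at the ASGI layer with a small middleware:

```python
SECURITY_HEADERS = {
    "X-Frame-Options": "DENY",
    "X-Content-Type-Options": "nosniff",
}

class SecurityHeadersMiddleware:
    """Minimal ASGI middleware that appends security headers to every
    HTTP response. Sketch only; header set is illustrative."""

    def __init__(self, app):
        self.app = app

    async def __call__(self, scope, receive, send):
        async def send_with_headers(message):
            if message["type"] == "http.response.start":
                headers = list(message.get("headers", []))
                headers += [
                    (k.encode("latin-1"), v.encode("latin-1"))
                    for k, v in SECURITY_HEADERS.items()
                ]
                message = {**message, "headers": headers}
            await send(message)

        await self.app(scope, receive, send_with_headers)
```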

## Configuration

Add to your `.env`:

```ini
# Required for AI narrative summarisation
LLM_API_KEY=your-key
LLM_BASE_URL=https://api.openai.com/v1
LLM_MODEL=gpt-4o-mini
LLM_MAX_EVENTS=200
LLM_TIMEOUT_SECONDS=30
LLM_API_VERSION=                 # set for Azure OpenAI, e.g. 2024-12-01-preview
```

For Azure OpenAI / MS Foundry:

```ini
LLM_BASE_URL=https://your-resource.openai.azure.com/openai/deployments/your-deployment
LLM_API_KEY=your-azure-key
LLM_API_VERSION=2024-12-01-preview
LLM_MODEL=your-deployment-name
```

## Upgrade notes

No breaking changes. Existing `/api/events`, filters, pagination, tags, and comments work unchanged.


## Docker image

```
git.cqre.net/cqrenet/aoc-backend:v1.2.5
```