30 Commits

Author SHA1 Message Date
7cd7709b4a fix: dedupe alert_rules before creating unique index in setup_indexes()
All checks were successful
CI / lint-and-test (push) Successful in 1m7s
Release / build-and-push (push) Successful in 2m25s
The unique index on alert_rules.name was being created before duplicates
were cleaned up, causing DuplicateKeyError on startup when existing
duplicates were present. Move deduplication into setup_indexes() so it
runs before the unique index is created.

v1.7.6
2026-04-22 15:20:19 +02:00
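A minimal sketch of the ordering this fix enforces (pymongo; database name assumed, mirroring the _dedupe_alert_rules() helper shown in the diff further down):

from pymongo import ASCENDING, MongoClient

rules = MongoClient("mongodb://localhost:27017")["aoc"]["alert_rules"]

# Keep the oldest document per name (ascending _id = insertion order)...
keep = {
    doc["_id"]: doc["first_id"]
    for doc in rules.aggregate([
        {"$sort": {"_id": ASCENDING}},
        {"$group": {"_id": "$name", "first_id": {"$first": "$_id"}}},
    ])
}
for name, keep_id in keep.items():
    rules.delete_many({"name": name, "_id": {"$ne": keep_id}})

# ...and only then build the unique index; with duplicates still present,
# create_index raises DuplicateKeyError, the startup crash fixed here.
rules.create_index("name", unique=True)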
9cd50d1257 chore: bump version to 1.7.5
All checks were successful
CI / lint-and-test (push) Successful in 30s
Release / build-and-push (push) Successful in 1m29s
2026-04-22 15:13:55 +02:00
646d61f72e fix: dedupe existing rules + unique index to prevent duplicates
- Add unique index on alert_rules.name in setup_indexes()
- seed_default_rules() now removes duplicates by name before upserting
- Keeps the oldest document (_id ascending) when deduping
2026-04-22 15:13:41 +02:00
5f7a98f21c chore: bump version to 1.7.4
All checks were successful
CI / lint-and-test (push) Successful in 28s
Release / build-and-push (push) Successful in 1m30s
2026-04-22 14:57:06 +02:00
19ed231a31 fix: prevent duplicate default rules on multi-worker startup
- Replace insert_many with replace_one(..., upsert=True) keyed by rule name
- Safe for concurrent startup with multiple gunicorn workers
2026-04-22 14:56:53 +02:00
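A minimal sketch of the upsert-based seeding described above (pymongo; database name and rule shape are assumptions based on the rest of this log):

from pymongo import MongoClient

db = MongoClient("mongodb://localhost:27017")["aoc"]

DEFAULT_RULES = [
    {
        "name": "After-Hours Admin Activity",
        "severity": "medium",
        "enabled": True,
        "message": "Admin activity outside business hours",
        "conditions": [{"field": "timestamp", "op": "after_hours", "value": ""}],
    },
]

def seed_default_rules():
    for rule in DEFAULT_RULES:
        # Keyed by name and idempotent: N workers racing at startup converge
        # on one document per rule, unlike insert_many (one copy per worker).
        db["alert_rules"].replace_one({"name": rule["name"]}, rule, upsert=True)
        # Note: concurrent upserts can still race without a unique index,
        # which is exactly what the 1.7.4/1.7.6 commits above add.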
f812fda150 chore: bump version to 1.7.3
All checks were successful
CI / lint-and-test (push) Successful in 44s
Release / build-and-push (push) Successful in 1m40s
2026-04-22 14:48:17 +02:00
a194c78c59 feat: all panels are now collapsible
- Source Health, Alerts, Alert Rules, Filters, Ask, Events panels all collapsible
- Click panel header to expand/collapse
- Chevron indicator rotates to show state
- Collapsed state persisted to localStorage (aoc_panels key)
2026-04-22 14:48:03 +02:00
e984899d4c chore: bump version to 1.7.2
All checks were successful
Release / build-and-push (push) Successful in 1m39s
CI / lint-and-test (push) Successful in 43s
2026-04-22 14:43:13 +02:00
b618cb29ea feat: alert rules management UI
- Add Alert Rules panel between Alerts and Filters sections
- List all rules with severity badge, on/off toggle, conditions preview
- Add Rule button opens modal with form for name, severity, message, conditions
- Edit existing rules inline
- Delete rules with confirmation
- Condition builder supports eq, neq, contains, in, after_hours operators
2026-04-22 14:42:58 +02:00
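A minimal sketch of how the five condition operators might evaluate against a normalized event (field names taken from the rule editor's datalist in the diff below; the after_hours window is an assumption, since the backend's actual business hours are not shown):

from datetime import datetime

def condition_matches(cond: dict, event: dict) -> bool:
    value = event.get(cond["field"], "")
    op = cond["op"]
    if op == "eq":
        return value == cond["value"]
    if op == "neq":
        return value != cond["value"]
    if op == "contains":
        return cond["value"].lower() in str(value).lower()
    if op == "in":
        return value in [v.strip() for v in cond["value"].split(",")]
    if op == "after_hours":
        # Assumed window: anything outside 08:00-18:00 counts as after hours.
        hour = datetime.fromisoformat(str(event["timestamp"])).hour
        return hour < 8 or hour >= 18
    return False

def rule_matches(rule: dict, event: dict) -> bool:
    # The editor labels conditions "all must match", i.e. logical AND.
    return all(condition_matches(c, event) for c in rule["conditions"])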
3e1416cd52 chore: bump version to 1.7.1
All checks were successful
CI / lint-and-test (push) Successful in 31s
Release / build-and-push (push) Successful in 1m32s
2026-04-22 14:21:46 +02:00
94983c43e9 fix: alert panel always visible, version display normalization
- Remove x-show condition hiding alert panel when no alerts exist
- Add empty state message explaining alerts appear on rule triggers
- Normalize appVersion in loadVersion() to strip leading 'v' (prevents vv1.7.0 in footer)
2026-04-22 14:21:34 +02:00
0a16cf6870 chore: bump version to 1.7.0
All checks were successful
CI / lint-and-test (push) Successful in 26s
Release / build-and-push (push) Successful in 1m15s
2026-04-22 14:12:49 +02:00
e348881083 feat: Admin Operations SIEM — alerts, notifications, pre-built rules
- Add pluggable notification system (webhook, Slack, Teams) with retry
- Add alert deduplication: same rule + actor within 15 min = one alert
- Add 10 pre-built admin-ops rule templates seeded on startup:
  - Failed Conditional Access, After-Hours Admin Activity
  - New Application Registration, Admin Role Assignment
  - License Change, Bulk User Deletion
  - Device Compliance Failure, Exchange Transport Rule Change
  - Service Principal Credential Added, External Sharing Enabled
- Add /api/alerts, /api/alerts/{id}/status, /api/alerts/summary endpoints
- Add alert dashboard to frontend with status filters and ack/resolve buttons
- Add alert summary badge in hero header (high/medium/low counts)
- New env vars: ALERT_WEBHOOK_URL, ALERT_WEBHOOK_FORMAT, ALERT_DEDUPE_MINUTES
2026-04-22 14:12:36 +02:00
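A minimal sketch of the 15-minute alert deduplication (pymongo; field names are assumptions mirroring the alert cards and webhook payloads elsewhere in this diff):

from datetime import UTC, datetime, timedelta

from pymongo import MongoClient

alerts = MongoClient("mongodb://localhost:27017")["aoc"]["alerts"]

def should_create_alert(rule_name: str, actor: str, dedupe_minutes: int = 15) -> bool:
    # Same rule + same actor within the window collapses into one alert.
    cutoff = datetime.now(UTC) - timedelta(minutes=dedupe_minutes)
    return alerts.find_one({
        "rule_name": rule_name,
        "actor_display": actor,
        "created_at": {"$gte": cutoff},
    }) is None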
a220494bcf docs: add Phase 6 multi-tenancy plan to roadmap
All checks were successful
CI / lint-and-test (push) Successful in 43s
- Row-level isolation architecture
- Per-tenant Entra + Graph credentials
- License-gated premium feature
- Deferred until SIEM export and alerting are production-tested
2026-04-22 13:49:56 +02:00
5bda1dd616 chore: bump version to 1.6.4
All checks were successful
CI / lint-and-test (push) Successful in 25s
Release / build-and-push (push) Successful in 1m29s
2026-04-22 12:16:32 +02:00
3e333291c6 fix: revert to single-click service filter, show all services by default, page size 24
- Revert +/- buttons on service pills back to single-click = filter only this service
- Remove default exclusion of Exchange/SharePoint/Teams (privacy controls handle this server-side)
- Change default page size from 25 to 24 (divisible by 3 for the 3-column grid)
- Update DEFAULT_PAGE_SIZE config default to 24
2026-04-22 12:16:20 +02:00
aa62528862 chore: bump version to 1.6.3
All checks were successful
CI / lint-and-test (push) Successful in 35s
Release / build-and-push (push) Successful in 1m47s
2026-04-22 12:02:28 +02:00
ac155d8843 feat: +/- buttons on service pills for additive/subtractive filtering
- Replace single-click service pill filter with explicit +/− buttons
- '+' adds the service to the current filter (keeps other selections)
- '−' removes the service from the current filter
- Result pills keep toggle click behavior
- Add .pill__action styles for small inline buttons
2026-04-22 12:02:11 +02:00
ed7465f5cd chore: bump version to 1.6.2
All checks were successful
Release / build-and-push (push) Successful in 1m33s
CI / lint-and-test (push) Successful in 33s
2026-04-22 11:53:21 +02:00
0eebcd0765 feat: clickable pills, configurable page size, CQRE.NET branding
- Service/category pills are now clickable: click to filter by that service
- Result pills (Success, Failure, etc.) are now clickable: click to filter by that result
- Click again to clear the filter (toggle behavior)
- Change default page size from 100 to 25
- Add DEFAULT_PAGE_SIZE config (env var, default 25), exposed via /api/config/features
- Change footer brand from CQRE to CQRE.NET
- Add pill--clickable hover styles
- Bump CSS cache-buster to v=10
2026-04-22 11:53:01 +02:00
67f3c28e82 chore: bump version to 1.6.1
All checks were successful
CI / lint-and-test (push) Successful in 32s
Release / build-and-push (push) Successful in 1m30s
2026-04-22 11:31:57 +02:00
04c41ee740 style: UI polish — topbar, footer, user info, product feel
- Add sticky top navigation bar with brand, repo/docs links, user chip
- Show logged-in user name + email from MSAL account
- Add footer with version, issue link, repo link, docs link
- Move action buttons (Fetch/Refresh/Login) to compact topbar
- Clean up hero section (removed buttons, just title + tagline)
- Bump CSS cache-buster to v=9
- Responsive stacking for mobile
2026-04-22 11:31:37 +02:00
cbd46adaa6 style: ruff format
All checks were successful
CI / lint-and-test (push) Successful in 25s
2026-04-22 10:08:32 +02:00
e4bafbc4b0 chore: fix ruff import order in test_ask.py
Some checks failed
CI / lint-and-test (push) Failing after 19s
2026-04-22 10:06:07 +02:00
f75f165911 feat: Redis caching + async queue for LLM scaling (v1.6.0)
Some checks failed
Release / build-and-push (push) Successful in 1m24s
CI / lint-and-test (push) Failing after 29s
- Add async Redis client singleton (redis_client.py) for caching and arq pool
- Add arq job functions (jobs.py) for background LLM processing
- Cache ask/explain LLM responses with TTL (1h ask, 24h explain)
- Add async mode to /api/ask: enqueue job, return job_id, poll /api/jobs/{id}
- Add GET /api/jobs/{job_id} endpoint for job status polling
- Add arq worker service to docker-compose (dev + prod)
- Switch from Redis to Valkey (BSD fork) in Docker Compose
- Add REDIS_URL config setting
- Add tests for cache hit, async mode, and job status
2026-04-22 09:55:05 +02:00
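A minimal sketch of the async flow from a caller's point of view, using arq's job API directly (the worker function name matches jobs.py in the diff below; in the app this is wrapped by POST /api/ask with async_mode and GET /api/jobs/{job_id}):

import asyncio

from arq import create_pool
from arq.connections import RedisSettings

async def main():
    pool = await create_pool(RedisSettings.from_dsn("redis://localhost:6379/0"))
    # What /api/ask does in async mode: enqueue and hand back a job_id.
    job = await pool.enqueue_job(
        "process_ask_question",
        "who deleted users yesterday?",  # question
        {},    # filters
        [],    # events
        0,     # total
        None,  # excluded_services
    )
    # What /api/jobs/{job_id} does: report the job's status for polling.
    print(job.job_id, await job.status())

asyncio.run(main())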
47e0dfc2ca chore: bump version to 1.5.0
All checks were successful
CI / lint-and-test (push) Successful in 37s
Release / build-and-push (push) Successful in 1m51s
2026-04-22 08:30:20 +02:00
2fffe3aec2 feat: operation-level privacy gating instead of broad service-level
All checks were successful
CI / lint-and-test (push) Successful in 21s
- Replace broad service-level hiding with fine-grained operation-level gating
- PRIVACY_SENSITIVE_OPERATIONS config: hide specific operations across ALL services
- PRIVACY_SERVICES still works for broad service-level blocking (optional)
- Users without PRIVACY_SERVICE_ROLES:
  * Don't see sensitive operations in /api/filter-options
  * Can't query sensitive operations via /api/events or /api/ask
  * Get 403 on /api/events/{id}/explain for sensitive events
- Exchange/Teams services remain visible; only privacy ops are hidden
- Update .env.example with new operation-level config docs
2026-04-22 08:23:46 +02:00
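A minimal sketch of how the operation-level gate could be applied to the events query (pymongo-style filter dict; the two config sets come from the config.py diff below, the query shape is an assumption):

PRIVACY_SENSITIVE_OPERATIONS = {"MailItemsAccessed", "Search-Mailbox", "Send"}
PRIVACY_SERVICES: set[str] = set()  # optional broad service-level blocking

def apply_privacy_filter(query: dict, has_privacy_roles: bool) -> dict:
    """Strip gated operations/services for users without the required roles."""
    if has_privacy_roles:
        return query
    if PRIVACY_SENSITIVE_OPERATIONS:
        # Hide specific operations across ALL services.
        query["operation"] = {"$nin": sorted(PRIVACY_SENSITIVE_OPERATIONS)}
    if PRIVACY_SERVICES:
        query["service"] = {"$nin": sorted(PRIVACY_SERVICES)}
    return query

# e.g. events_collection.find(apply_privacy_filter({"result": "Failure"}, False))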
b2f4cabef4 feat: service-level role gating for privacy-sensitive services (Option A)
All checks were successful
CI / lint-and-test (push) Successful in 25s
- Add PRIVACY_SERVICES and PRIVACY_SERVICE_ROLES config variables
- Add user_can_access_privacy_services(claims) helper in auth.py
- /api/events filters out privacy services for users without required roles
- /api/filter-options excludes privacy services from dropdown options
- /api/ask excludes privacy services from NLQ queries
- /api/events/{id}/explain returns 403 for privacy events if unauthorized
- Teams added to default noisy service exclusion (frontend + backend)
- Update .env.example with privacy config documentation
- Add tests for event filtering, filter-options exclusion, and explain 403
2026-04-22 07:26:21 +02:00
e069869a94 feat: exclude Teams from defaults + GUID resolution in explain
All checks were successful
CI / lint-and-test (push) Successful in 26s
- Add Teams to noisy services excluded by default (frontend + backend ask)
- Exchange, SharePoint, and Teams now unchecked by default in filters
- Enhance explain endpoint with GUID resolution:
  * Extract UUIDs from raw event JSON recursively
  * Resolve directory objects via Graph API (user, group, SP, device)
  * Include resolved names in LLM prompt so explanations reference
    human-readable names instead of raw GUIDs
- Add asyncio import for to_thread wrapper around sync Graph calls
2026-04-22 07:12:10 +02:00
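A minimal sketch of the recursive UUID extraction described above (regex-based; the Graph resolution step is only summarized, since that client code is not part of this diff):

import re

UUID_RE = re.compile(
    r"[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}",
    re.IGNORECASE,
)

def extract_guids(value, found: set[str] | None = None) -> set[str]:
    """Walk nested dicts/lists/strings in a raw event and collect every UUID."""
    found = found if found is not None else set()
    if isinstance(value, str):
        found.update(m.group(0).lower() for m in UUID_RE.finditer(value))
    elif isinstance(value, dict):
        for v in value.values():
            extract_guids(v, found)
    elif isinstance(value, list):
        for v in value:
            extract_guids(v, found)
    return found

# Each GUID is then resolved via Graph (user/group/service principal/device)
# and the id -> display-name map is appended to the LLM prompt.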
fb2386e190 feat: saved searches (bookmarks)
All checks were successful
CI / lint-and-test (push) Successful in 23s
- Add saved_searches_collection to database.py with index on created_by+created_at
- New routes/saved_searches.py: GET /api/saved-searches, POST, DELETE
- Saved searches are scoped per user (created_by = token sub)
- Mount router in main.py
- Frontend: Save filters button, saved search pills with load/delete
- loadSavedSearches called on initApp
- applySavedSearch restores filters and validates services against current options
- Add CSS for saved-searches row
- Add tests for CRUD, delete 404, and name validation
2026-04-22 07:04:07 +02:00
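A minimal sketch of the per-user scoping in routes/saved_searches.py (FastAPI; only the endpoints and the created_by = token sub rule come from the commit message, handler bodies are assumptions):

from fastapi import APIRouter, Depends, HTTPException
from pymongo import MongoClient

saved_searches = MongoClient("mongodb://localhost:27017")["aoc"]["saved_searches"]
router = APIRouter()

def require_auth() -> dict:
    # Stand-in for the real dependency in auth.py.
    return {"sub": "anonymous"}

@router.get("/saved-searches")
def list_saved_searches(claims: dict = Depends(require_auth)):
    # Scoped per user: created_by is always the caller's token sub.
    return list(saved_searches.find({"created_by": claims["sub"]}, {"_id": 0}))

@router.delete("/saved-searches/{search_id}")
def delete_saved_search(search_id: str, claims: dict = Depends(require_auth)):
    result = saved_searches.delete_one({"id": search_id, "created_by": claims["sub"]})
    if result.deleted_count == 0:
        raise HTTPException(status_code=404, detail="Saved search not found")
    return {"deleted": True}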
27 changed files with 2362 additions and 47 deletions


@@ -49,3 +49,25 @@ LLM_MODEL=gpt-4o-mini
LLM_MAX_EVENTS=200
LLM_TIMEOUT_SECONDS=30
LLM_API_VERSION=
# Valkey (caching + async job queue for LLM calls)
# In Docker Compose, this is set automatically to redis://redis:6379/0
# For local dev, start Valkey with: docker run -d -p 6379:6379 valkey/valkey:8-alpine
REDIS_URL=redis://localhost:6379/0
# UI default page size (number of events shown per page)
DEFAULT_PAGE_SIZE=24
# Alert notifications (optional)
# Send triggered admin-ops alerts to a webhook (Slack, Teams, or generic)
ALERT_WEBHOOK_URL=
ALERT_WEBHOOK_FORMAT=generic # generic | slack | teams
ALERT_DEDUPE_MINUTES=15
# Optional: privacy / access control
# Hide entire services from users without PRIVACY_SERVICE_ROLES
# PRIVACY_SERVICES=Exchange,Teams
# Hide specific operations across all services from users without PRIVACY_SERVICE_ROLES
# PRIVACY_SENSITIVE_OPERATIONS=MailItemsAccessed,Search-Mailbox,Send,ChatMessageRead
# Comma-separated list of Entra roles that can access privacy-sensitive data
# PRIVACY_SERVICE_ROLES=SecurityAdministrator,ComplianceAdministrator


@@ -65,9 +65,39 @@ Goal: add AI-powered analysis and external tool integration.
- [x] AI feature flag (`AI_FEATURES_ENABLED`) to gate LLM-dependent features
- [x] Natural language query endpoint (`/api/ask`) with intent extraction and smart sampling
- [x] MCP (Model Context Protocol) server for Claude Desktop / Cursor integration
- [x] Valkey caching for LLM responses and frequent queries
- [x] Async queue (arq) for LLM requests to prevent timeout/cost explosions at scale
- [ ] Advanced analytics dashboard (trending operations, anomaly detection)
- [ ] Redis caching for LLM responses and frequent queries
- [ ] Async queue for LLM requests to prevent timeout/cost explosions at scale
## Completed in this PR
All Phase 5 items marked done were implemented in v1.3.0.
All Phase 5 items marked done were implemented in v1.3.0–v1.5.0.
Redis caching + async queue implemented in v1.6.0, switched to Valkey.
UI polish (topbar, footer, clickable pills) in v1.6.1–v1.6.4.
---
## Phase 6: Multi-Tenancy (Premium) ⏸️
Goal: allow MSPs to manage multiple client tenants from a single deployment.
Status: **Planned — not started**. Architecture designed, pending validation of core features (SIEM export, alerting) in production first.
### Architecture
- Row-level isolation: `tenant_id` field on every MongoDB document
- Each tenant has their own Microsoft Entra tenant + app registration credentials
- Auth: user's JWT `tid` claim maps to tenant config automatically
- Super-admin role for MSP staff to access all tenants
### Implementation phases
- **Phase 6.1** (2–3 days): Tenant model & registry, tenant-aware data layer, per-tenant Graph API auth
- **Phase 6.2** (1 day): Tenant-scoped API routes, tenant-specific config endpoints
- **Phase 6.3** (2 days): Frontend tenant switcher, tenant name display, admin page
- **Phase 6.4** (1 day): License gating — signed JWT `LICENSE_KEY` gates multi-tenant mode
### Licensing model
- Single-tenant: remains MIT/free
- Multi-tenant: premium feature requiring a signed license key
- License key is a JWT with claims: `plan`, `max_tenants`, `exp`, `features`
- Offline license generation tool included
### Effort estimate
~7–9 days total. Deferred until SIEM export and alerting are battle-tested.


@@ -1 +1 @@
1.4.0
1.7.6


@@ -8,6 +8,8 @@ from config import (
AUTH_CLIENT_ID,
AUTH_ENABLED,
AUTH_TENANT_ID,
PRIVACY_SERVICE_ROLES,
PRIVACY_SERVICES,
)
from fastapi import Header, HTTPException
from jwt import ExpiredSignatureError, InvalidTokenError, decode
@@ -82,6 +84,14 @@ def _decode_token(token: str, jwks):
raise HTTPException(status_code=401, detail=f"Invalid token ({type(exc).__name__})") from None
def user_can_access_privacy_services(claims: dict) -> bool:
"""Check if the user has roles that grant access to privacy-sensitive services."""
if not PRIVACY_SERVICES or not PRIVACY_SERVICE_ROLES:
return True
user_roles = set(claims.get("roles", []) or claims.get("role", []) or [])
return bool(user_roles.intersection(PRIVACY_SERVICE_ROLES))
def require_auth(authorization: str | None = Header(None)):
if not AUTH_ENABLED:
return {"sub": "anonymous"}


@@ -51,6 +51,23 @@ class Settings(BaseSettings):
LLM_TIMEOUT_SECONDS: int = 30
LLM_API_VERSION: str = "" # e.g. 2025-01-01-preview for Azure OpenAI
# Privacy / access control
# Entire services can be hidden, or specific operations can be gated.
PRIVACY_SERVICES: str = "" # comma-separated, e.g. "Exchange,Teams"
PRIVACY_SENSITIVE_OPERATIONS: str = "" # comma-separated, e.g. "MailItemsAccessed,Search-Mailbox,Send"
PRIVACY_SERVICE_ROLES: str = "" # comma-separated, e.g. "SecurityAdministrator,ComplianceAdministrator"
# Redis (caching + async job queue)
REDIS_URL: str = "redis://localhost:6379/0"
# UI defaults
DEFAULT_PAGE_SIZE: int = 24
# Alert notifications
ALERT_WEBHOOK_URL: str = ""
ALERT_WEBHOOK_FORMAT: str = "generic" # generic | slack | teams
ALERT_DEDUPE_MINUTES: int = 15
_settings = Settings()
@@ -85,3 +102,14 @@ LLM_MODEL = _settings.LLM_MODEL
LLM_MAX_EVENTS = _settings.LLM_MAX_EVENTS
LLM_TIMEOUT_SECONDS = _settings.LLM_TIMEOUT_SECONDS
LLM_API_VERSION = _settings.LLM_API_VERSION
PRIVACY_SERVICES = {s.strip() for s in _settings.PRIVACY_SERVICES.split(",") if s.strip()}
PRIVACY_SENSITIVE_OPERATIONS = {o.strip() for o in _settings.PRIVACY_SENSITIVE_OPERATIONS.split(",") if o.strip()}
PRIVACY_SERVICE_ROLES = {r.strip() for r in _settings.PRIVACY_SERVICE_ROLES.split(",") if r.strip()}
REDIS_URL = _settings.REDIS_URL
DEFAULT_PAGE_SIZE = _settings.DEFAULT_PAGE_SIZE
ALERT_WEBHOOK_URL = _settings.ALERT_WEBHOOK_URL
ALERT_WEBHOOK_FORMAT = _settings.ALERT_WEBHOOK_FORMAT
ALERT_DEDUPE_MINUTES = _settings.ALERT_DEDUPE_MINUTES


@@ -7,9 +7,25 @@ from pymongo import ASCENDING, DESCENDING, TEXT, MongoClient
client = MongoClient(MONGO_URI or "mongodb://localhost:27017")
db = client[DB_NAME]
events_collection = db["events"]
saved_searches_collection = db["saved_searches"]
alerts_collection = db["alerts"]
logger = structlog.get_logger("aoc.database")
def _dedupe_alert_rules():
"""Remove duplicate alert_rules by name, keeping the oldest document."""
try:
pipeline = [
{"$sort": {"_id": ASCENDING}},
{"$group": {"_id": "$name", "first_id": {"$first": "$_id"}}},
]
seen = {doc["_id"]: doc["first_id"] for doc in db["alert_rules"].aggregate(pipeline)}
for name, keep_id in seen.items():
db["alert_rules"].delete_many({"name": name, "_id": {"$ne": keep_id}})
except Exception:
pass # Collection may not exist yet
def setup_indexes(max_retries: int = 5, delay: float = 2.0):
"""Ensure MongoDB indexes exist. Retries on connection errors."""
from time import sleep
@@ -20,6 +36,9 @@ def setup_indexes(max_retries: int = 5, delay: float = 2.0):
events_collection.create_index([("timestamp", DESCENDING)])
events_collection.create_index([("service", ASCENDING), ("timestamp", DESCENDING)])
events_collection.create_index("id")
saved_searches_collection.create_index([("created_by", ASCENDING), ("created_at", DESCENDING)])
_dedupe_alert_rules()
db["alert_rules"].create_index("name", unique=True)
events_collection.create_index(
[("actor_display", TEXT), ("raw_text", TEXT), ("operation", TEXT)],
name="text_search_index",


@@ -4,28 +4,63 @@
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>Admin Operations Center</title>
<link rel="stylesheet" href="/style.css?v=8" />
<link rel="stylesheet" href="/style.css?v=15" />
<script defer src="https://cdn.jsdelivr.net/npm/alpinejs@3.x.x/dist/cdn.min.js"></script>
<script src="https://alcdn.msauth.net/browser/2.37.0/js/msal-browser.min.js" crossorigin="anonymous"></script>
</head>
<body>
<div class="page" x-data="aocApp()" x-init="initApp()">
<nav class="topbar">
<div class="topbar__brand">
<span class="topbar__logo">🔍</span>
<span class="topbar__name">AOC</span>
<span class="version-badge" x-text="appVersion"></span>
</div>
<div class="topbar__links">
<a :href="repoUrl" target="_blank" rel="noopener">Repository</a>
<a :href="docsUrl" target="_blank" rel="noopener">Docs</a>
</div>
<div class="topbar__meta">
<template x-if="account">
<div class="user-chip">
<div class="user-avatar" x-text="(account.name || account.username || '?').charAt(0).toUpperCase()"></div>
<div class="user-details">
<span class="user-name" x-text="account.name || account.username || ''"></span>
<span class="user-email" x-text="account.username || ''"></span>
</div>
</div>
</template>
<template x-if="!account && authConfig?.auth_enabled">
<span class="login-hint">Not signed in</span>
</template>
</div>
<div class="topbar__actions">
<button id="fetchBtn" class="ghost btn--compact" aria-label="Fetch latest audit logs" @click="fetchLogs()">Fetch</button>
<button id="refreshBtn" class="ghost btn--compact" aria-label="Refresh events" @click="loadEvents(currentCursor)">Refresh</button>
<button id="authBtn" class="ghost btn--compact" aria-label="Login" x-text="authBtnText" @click="toggleAuth()"></button>
</div>
</nav>
<header class="hero">
<div>
<p class="eyebrow">Admin Operations Center <span class="version-badge" x-text="appVersion"></span></p>
<p class="eyebrow">Admin Operations Center</p>
<h1>Audit Log Explorer</h1>
<p class="lede">Search and review Microsoft audit events from Entra, Intune, Exchange, SharePoint, and Teams.</p>
</div>
<div class="cta">
<button id="authBtn" class="ghost" aria-label="Login" x-text="authBtnText" @click="toggleAuth()"></button>
<button id="fetchBtn" aria-label="Fetch latest audit logs" @click="fetchLogs()">Fetch new</button>
<button id="refreshBtn" aria-label="Refresh events" @click="loadEvents(currentCursor)">Refresh</button>
<div class="alert-summary" x-show="alertSummary.total_open > 0">
<div class="alert-badge alert-badge--high" x-show="alertSummary.high > 0" x-text="alertSummary.high"></div>
<div class="alert-badge alert-badge--medium" x-show="alertSummary.medium > 0" x-text="alertSummary.medium"></div>
<div class="alert-badge alert-badge--low" x-show="alertSummary.low > 0" x-text="alertSummary.low"></div>
<span class="alert-label">open alerts</span>
</div>
</header>
<section class="panel">
<h3>Source Health</h3>
<div class="source-health">
<div class="panel-header panel-header--collapsible" @click="togglePanel('sourceHealth')">
<h3>Source Health</h3>
<span class="panel-toggle" :class="panelState.sourceHealth ? 'panel-toggle--open' : ''"></span>
</div>
<div x-show="panelState.sourceHealth">
<template x-for="src in sourceHealth" :key="src.source">
<div class="health-card">
<strong x-text="src.source"></strong>
@@ -39,7 +74,160 @@
</section>
<section class="panel">
<form id="filters" class="filters" @submit.prevent="resetPagination(); loadEvents()">
<div class="panel-header panel-header--collapsible" @click="togglePanel('alerts')">
<h3>Alerts</h3>
<div style="display:flex;align-items:center;gap:10px;">
<span x-text="`${alertSummary.total_open} open`" class="alert-open-count"></span>
<span class="panel-toggle" :class="panelState.alerts ? 'panel-toggle--open' : ''"></span>
</div>
</div>
<div x-show="panelState.alerts">
<div class="alert-filters">
<select x-model="alertsFilter.status" @change="alertsPage = 1; loadAlerts()">
<option value="">All statuses</option>
<option value="open">Open</option>
<option value="acknowledged">Acknowledged</option>
<option value="resolved">Resolved</option>
<option value="false_positive">False Positive</option>
</select>
<select x-model="alertsFilter.severity" @change="alertsPage = 1; loadAlerts()">
<option value="">All severities</option>
<option value="high">High</option>
<option value="medium">Medium</option>
<option value="low">Low</option>
</select>
</div>
<div class="alerts-list" x-show="alerts.length > 0">
<template x-for="alert in alerts" :key="alert._id || alert.event_id">
<div class="alert-card" :class="'alert-card--' + alert.severity">
<div class="alert-card__meta">
<span class="pill" :class="alert.severity === 'high' ? 'pill--err' : (alert.severity === 'medium' ? 'pill--warn' : '')" x-text="alert.severity"></span>
<span class="pill" x-text="alert.status"></span>
<small x-text="new Date(alert.timestamp).toLocaleString()"></small>
</div>
<strong x-text="alert.rule_name"></strong>
<p x-text="alert.message"></p>
<div class="alert-card__actions">
<button type="button" class="ghost btn--compact" @click="updateAlertStatus(alert._id, 'acknowledged')" x-show="alert.status === 'open'">Acknowledge</button>
<button type="button" class="ghost btn--compact" @click="updateAlertStatus(alert._id, 'resolved')" x-show="alert.status !== 'resolved' && alert.status !== 'false_positive'">Resolve</button>
<button type="button" class="ghost btn--compact" @click="updateAlertStatus(alert._id, 'false_positive')" x-show="alert.status !== 'false_positive'">False Positive</button>
<button type="button" class="ghost btn--compact" @click="updateAlertStatus(alert._id, 'open')" x-show="alert.status !== 'open'">Reopen</button>
</div>
</div>
</template>
</div>
<div class="alerts-empty" x-show="alerts.length === 0">
<p>No alerts match the current filters. Alerts appear here when rules trigger during event ingestion.</p>
</div>
<div class="pagination" x-show="alertsTotal > 20">
<button type="button" :disabled="alertsPage === 1" @click="alertsPage--; loadAlerts()">Prev</button>
<span x-text="`Page ${alertsPage}`"></span>
<button type="button" :disabled="alertsPage * 20 >= alertsTotal" @click="alertsPage++; loadAlerts()">Next</button>
</div>
</div>
</section>
<section class="panel">
<div class="panel-header panel-header--collapsible" @click="togglePanel('rules')">
<h3>Alert Rules</h3>
<div style="display:flex;align-items:center;gap:10px;">
<button type="button" class="btn--compact" @click.stop="openRuleEditor()">+ Add rule</button>
<span class="panel-toggle" :class="panelState.rules ? 'panel-toggle--open' : ''"></span>
</div>
</div>
<div x-show="panelState.rules">
<div class="rules-list">
<template x-for="rule in rules" :key="rule.id">
<div class="rule-card" :class="rule.enabled ? '' : 'rule-card--disabled'">
<div class="rule-card__meta">
<span class="pill" :class="rule.severity === 'high' ? 'pill--err' : (rule.severity === 'medium' ? 'pill--warn' : '')" x-text="rule.severity"></span>
<label class="toggle-label">
<input type="checkbox" :checked="rule.enabled" @change="toggleRule(rule.id, $event.target.checked)">
<span x-text="rule.enabled ? 'On' : 'Off'"></span>
</label>
</div>
<strong x-text="rule.name"></strong>
<p x-text="rule.message"></p>
<div class="rule-card__conditions">
<template x-for="(cond, idx) in rule.conditions" :key="idx">
<span class="pill pill--tag" x-text="`${cond.field} ${cond.op} ${cond.value}`"></span>
</template>
</div>
<div class="rule-card__actions">
<button type="button" class="ghost btn--compact" @click="openRuleEditor(rule)">Edit</button>
<button type="button" class="ghost btn--compact" @click="deleteRule(rule.id)">Delete</button>
</div>
</div>
</template>
</div>
<div class="rules-empty" x-show="rules.length === 0">
<p>No custom rules yet. Pre-built admin-ops rules are active by default. Add your own rules to detect specific patterns.</p>
</div>
</div>
<div id="ruleModal" class="modal hidden" role="dialog" aria-modal="true" :class="{ 'hidden': !ruleModalOpen }">
<div class="modal__content" style="max-width: 600px;">
<div class="modal__header">
<h3 x-text="ruleEditId ? 'Edit Rule' : 'New Rule'"></h3>
<button type="button" class="ghost" @click="ruleModalOpen = false">Close</button>
</div>
<form class="rule-form" @submit.prevent="saveRule()">
<label>
Name
<input type="text" x-model="ruleEdit.name" placeholder="e.g. Failed CA Policy" required />
</label>
<label>
Severity
<select x-model="ruleEdit.severity">
<option value="low">Low</option>
<option value="medium">Medium</option>
<option value="high">High</option>
</select>
</label>
<label>
Message
<textarea x-model="ruleEdit.message" placeholder="What should the alert say?" rows="2"></textarea>
</label>
<div class="rule-conditions">
<span>Conditions (all must match)</span>
<template x-for="(cond, idx) in ruleEdit.conditions" :key="idx">
<div class="condition-row">
<input type="text" x-model="cond.field" placeholder="field" list="ruleFieldOptions" required />
<select x-model="cond.op">
<option value="eq">equals</option>
<option value="neq">not equals</option>
<option value="contains">contains</option>
<option value="in">in list</option>
<option value="after_hours">after hours</option>
</select>
<input type="text" x-model="cond.value" placeholder="value" :required="cond.op !== 'after_hours'" />
<button type="button" class="ghost btn--compact" @click="ruleEdit.conditions.splice(idx, 1)"></button>
</div>
</template>
<button type="button" class="ghost btn--compact" @click="ruleEdit.conditions.push({field:'', op:'eq', value:''})">+ Add condition</button>
</div>
<datalist id="ruleFieldOptions">
<option value="service"></option>
<option value="operation"></option>
<option value="result"></option>
<option value="actor_display"></option>
<option value="timestamp"></option>
</datalist>
<div class="rule-form__actions">
<button type="submit">Save</button>
<button type="button" class="ghost" @click="ruleModalOpen = false">Cancel</button>
</div>
</form>
</div>
</div>
</section>
<section class="panel">
<div class="panel-header panel-header--collapsible" @click="togglePanel('filters')">
<h3>Filters</h3>
<span class="panel-toggle" :class="panelState.filters ? 'panel-toggle--open' : ''"></span>
</div>
<form id="filters" class="filters" @submit.prevent="resetPagination(); loadEvents()" x-show="panelState.filters">
<div class="filter-row">
<label>
User (name/UPN)
@@ -112,17 +300,32 @@
<div class="actions">
<button type="submit">Apply filters</button>
<button type="button" id="clearBtn" class="ghost" @click="clearFilters()">Clear</button>
<button type="button" class="ghost" @click="saveCurrentFilters()">Save filters</button>
<button type="button" class="ghost" @click="bulkTagMatching()">Bulk tag matching</button>
<button type="button" class="ghost" @click="exportJSON()">Export JSON</button>
<button type="button" class="ghost" @click="exportCSV()">Export CSV</button>
</div>
</div>
<div class="filter-row" x-show="savedSearches.length">
<div class="saved-searches">
<span>Saved:</span>
<template x-for="ss in savedSearches" :key="ss.id">
<span class="pill pill--tag" style="cursor:pointer;" @click="applySavedSearch(ss)">
<span x-text="ss.name"></span>
<button type="button" class="link" style="margin-left:4px;" @click.stop="deleteSavedSearch(ss.id)">×</button>
</span>
</template>
</div>
</div>
</form>
</section>
<section class="panel" x-show="aiFeaturesEnabled">
<h3>Ask a question</h3>
<form class="ask-form" @submit.prevent="askQuestion()">
<div class="panel-header panel-header--collapsible" @click="togglePanel('ask')">
<h3>Ask a question</h3>
<span class="panel-toggle" :class="panelState.ask ? 'panel-toggle--open' : ''"></span>
</div>
<form class="ask-form" @submit.prevent="askQuestion()" x-show="panelState.ask">
<div class="ask-row">
<input
type="text"
@@ -146,8 +349,8 @@
<template x-for="(evt, idx) in askEvents" :key="evt.id || idx">
<article class="event event--compact">
<div class="event__meta">
<span class="pill" x-text="evt.display_category || evt.service || '—'"></span>
<span class="pill" :class="['success','succeeded','ok','passed','true'].includes((evt.result || '').toLowerCase()) ? 'pill--ok' : 'pill--warn'" x-text="evt.result || '—'"></span>
<span class="pill pill--clickable" x-text="evt.display_category || evt.service || '—'" @click="filterByService(evt.service || evt.display_category)" title="Filter by this service"></span>
<span class="pill pill--clickable" :class="['success','succeeded','ok','passed','true'].includes((evt.result || '').toLowerCase()) ? 'pill--ok' : 'pill--warn'" x-text="evt.result || '—'" @click="filterByResult(evt.result)" title="Filter by this result"></span>
</div>
<h3 x-text="evt.operation || '—'"></h3>
<p class="event__detail" x-show="evt.display_summary"><strong>Summary:</strong> <span x-text="evt.display_summary"></span></p>
@@ -164,17 +367,21 @@
</section>
<section class="panel">
<div class="panel-header">
<div class="panel-header panel-header--collapsible" @click="togglePanel('events')">
<h2>Events</h2>
<span id="count" x-text="countText"></span>
<div style="display:flex;align-items:center;gap:10px;">
<span id="count" x-text="countText"></span>
<span class="panel-toggle" :class="panelState.events ? 'panel-toggle--open' : ''"></span>
</div>
</div>
<div id="status" class="status" aria-live="polite" x-text="statusText"></div>
<div x-show="panelState.events">
<div id="status" class="status" aria-live="polite" x-text="statusText"></div>
<div id="events" class="events">
<template x-for="(evt, idx) in events" :key="evt._id || evt.id || idx">
<article class="event">
<div class="event__meta">
<span class="pill" x-text="evt.display_category || evt.service || '—'"></span>
<span class="pill" :class="['success','succeeded','ok','passed','true'].includes((evt.result || '').toLowerCase()) ? 'pill--ok' : 'pill--warn'" x-text="evt.result || '—'"></span>
<span class="pill pill--clickable" x-text="evt.display_category || evt.service || '—'" @click="filterByService(evt.service || evt.display_category)" title="Filter by this service"></span>
<span class="pill pill--clickable" :class="['success','succeeded','ok','passed','true'].includes((evt.result || '').toLowerCase()) ? 'pill--ok' : 'pill--warn'" x-text="evt.result || '—'" @click="filterByResult(evt.result)" title="Filter by this result"></span>
</div>
<h3 x-text="evt.operation || '—'"></h3>
<p class="event__detail" x-show="evt.display_summary"><strong>Summary:</strong> <span x-text="evt.display_summary"></span></p>
@@ -208,6 +415,7 @@
<span x-text="`Page ${cursorStack.length + 1}`"></span>
<button type="button" id="nextPage" :disabled="!nextCursor" @click="goNext()">Next</button>
</div>
</div>
</section>
<div id="modal" class="modal hidden" role="dialog" aria-modal="true" aria-labelledby="modalTitle" :class="{ 'hidden': !modalOpen }">
@@ -227,6 +435,21 @@
<pre id="modalBody" x-text="modalBody"></pre>
</div>
</div>
<footer class="footer">
<div class="footer__left">
<span class="footer__brand">Admin Operations Center</span>
<span class="footer__version" x-text="'v' + appVersion"></span>
</div>
<div class="footer__center">
<a :href="repoUrl + '/issues/new'" target="_blank" rel="noopener">🐛 Report an issue</a>
<a :href="repoUrl" target="_blank" rel="noopener">💻 Source code</a>
<a :href="docsUrl" target="_blank" rel="noopener">📖 Documentation</a>
</div>
<div class="footer__right">
<span>Built with ❤️ by CQRE.NET</span>
</div>
</footer>
</div>
<script>
@@ -252,11 +475,24 @@
accessToken: null,
authScopes: [],
filters: {
actor: '', selectedServices: [], search: '', operation: '', result: '', start: '', end: '', limit: 100, includeTags: '', excludeTags: '',
actor: '', selectedServices: [], search: '', operation: '', result: '', start: '', end: '', limit: 24, includeTags: '', excludeTags: '',
},
panelState: { sourceHealth: true, alerts: true, rules: true, filters: true, ask: true, events: true },
options: { actors: [], services: [], operations: [], results: [] },
savedSearches: [],
appVersion: '',
repoUrl: 'https://git.cqre.net/cqrenet/aoc',
docsUrl: 'https://git.cqre.net/cqrenet/aoc/src/branch/main/README.md',
aiFeaturesEnabled: true,
alertSummary: { total_open: 0, high: 0, medium: 0, low: 0 },
alerts: [],
alertsTotal: 0,
alertsPage: 1,
alertsFilter: { status: 'open', severity: '' },
rules: [],
ruleModalOpen: false,
ruleEditId: null,
ruleEdit: { name: '', enabled: true, severity: 'medium', message: '', conditions: [] },
askQuestionText: '',
askLoading: false,
askAnswer: '',
@@ -269,9 +505,14 @@
await this.loadVersion();
await this.initAuth();
this.loadSavedFilters();
this.loadPanelState();
if (!this.authConfig?.auth_enabled || this.accessToken) {
await this.loadFilterOptions();
await this.loadSavedSearches();
await this.loadSourceHealth();
await this.loadAlertSummary();
await this.loadAlerts();
await this.loadRules();
await this.loadEvents();
}
},
@@ -294,12 +535,33 @@
} catch {}
},
loadPanelState() {
try {
const saved = localStorage.getItem('aoc_panels');
if (saved) {
const parsed = JSON.parse(saved);
Object.keys(parsed).forEach((k) => { if (this.panelState[k] !== undefined) this.panelState[k] = parsed[k]; });
}
} catch {}
},
savePanelState() {
try {
localStorage.setItem('aoc_panels', JSON.stringify(this.panelState));
} catch {}
},
togglePanel(key) {
this.panelState[key] = !this.panelState[key];
this.savePanelState();
},
async loadVersion() {
try {
const res = await fetch('/api/version');
if (res.ok) {
const body = await res.json();
this.appVersion = body.version || '';
this.appVersion = (body.version || '').replace(/^v/, '');
}
} catch {}
},
@@ -339,6 +601,11 @@
if (featRes.ok) {
const featBody = await featRes.json();
this.aiFeaturesEnabled = featBody.ai_features_enabled !== false;
if (featBody.default_page_size) {
this.filters.limit = featBody.default_page_size;
} else {
this.filters.limit = 24;
}
} else {
this.aiFeaturesEnabled = true;
}
@@ -507,9 +774,8 @@
const saved = localStorage.getItem('aoc_filters');
if (!saved && this.options.services.length) {
// Default: exclude noisy high-volume services
const noisy = ['Exchange', 'SharePoint'];
this.filters.selectedServices = this.options.services.filter((s) => !noisy.includes(s));
// Default: show all services (privacy controls handle exclusions server-side)
this.filters.selectedServices = [...this.options.services];
} else if (saved) {
try {
const parsed = JSON.parse(saved);
@@ -529,6 +795,59 @@
} catch {}
},
async loadSavedSearches() {
try {
const res = await fetch('/api/saved-searches', { headers: this.authHeader() });
if (!res.ok) return;
this.savedSearches = await res.json();
} catch {}
},
async saveCurrentFilters() {
const name = prompt('Name this saved filter:');
if (!name || !name.trim()) return;
try {
const res = await fetch('/api/saved-searches', {
method: 'POST',
headers: { 'Content-Type': 'application/json', ...this.authHeader() },
body: JSON.stringify({ name: name.trim(), filters: { ...this.filters } }),
});
if (!res.ok) throw new Error(await res.text());
const created = await res.json();
this.savedSearches.unshift(created);
this.statusText = 'Filters saved.';
setTimeout(() => { if (this.statusText === 'Filters saved.') this.statusText = ''; }, 2000);
} catch (err) {
this.statusText = err.message || 'Failed to save filters.';
}
},
applySavedSearch(ss) {
if (!ss || !ss.filters) return;
const fields = ['actor', 'selectedServices', 'search', 'operation', 'result', 'start', 'end', 'limit', 'includeTags', 'excludeTags'];
fields.forEach((f) => {
if (ss.filters[f] !== undefined) this.filters[f] = ss.filters[f];
});
// Validate selectedServices against current options
this.filters.selectedServices = this.filters.selectedServices.filter((s) => this.options.services.includes(s));
this.resetPagination();
this.loadEvents();
},
async deleteSavedSearch(id) {
if (!confirm('Delete this saved search?')) return;
try {
const res = await fetch(`/api/saved-searches/${id}`, {
method: 'DELETE',
headers: this.authHeader(),
});
if (!res.ok) throw new Error(await res.text());
this.savedSearches = this.savedSearches.filter((s) => s.id !== id);
} catch (err) {
this.statusText = err.message || 'Failed to delete saved search.';
}
},
resetPagination() {
this.cursorStack = [];
this.nextCursor = null;
@@ -550,13 +869,137 @@
},
clearFilters() {
const noisy = ['Exchange', 'SharePoint'];
this.filters = { actor: '', selectedServices: this.options.services.filter((s) => !noisy.includes(s)), search: '', operation: '', result: '', start: '', end: '', limit: 100, includeTags: '', excludeTags: '' };
this.filters = { actor: '', selectedServices: [...this.options.services], search: '', operation: '', result: '', start: '', end: '', limit: 24, includeTags: '', excludeTags: '' };
this.saveFilters();
this.resetPagination();
this.loadEvents();
},
filterByService(service) {
if (!service) return;
this.filters.selectedServices = [service];
this.saveFilters();
this.resetPagination();
this.loadEvents();
},
filterByResult(result) {
if (!result) return;
this.filters.result = this.filters.result === result ? '' : result;
this.saveFilters();
this.resetPagination();
this.loadEvents();
},
async loadAlertSummary() {
try {
const res = await fetch('/api/alerts/summary', { headers: this.authHeader() });
if (!res.ok) return;
const body = await res.json();
this.alertSummary.total_open = body.total_open || 0;
const sev = body.by_status_severity || [];
this.alertSummary.high = sev.filter((s) => s._id.severity === 'high' && s._id.status === 'open').reduce((a, b) => a + b.count, 0);
this.alertSummary.medium = sev.filter((s) => s._id.severity === 'medium' && s._id.status === 'open').reduce((a, b) => a + b.count, 0);
this.alertSummary.low = sev.filter((s) => s._id.severity === 'low' && s._id.status === 'open').reduce((a, b) => a + b.count, 0);
} catch {}
},
async loadAlerts() {
try {
const params = new URLSearchParams();
params.append('page_size', '20');
params.append('page', String(this.alertsPage));
if (this.alertsFilter.status) params.append('status', this.alertsFilter.status);
if (this.alertsFilter.severity) params.append('severity', this.alertsFilter.severity);
const res = await fetch(`/api/alerts?${params.toString()}`, { headers: this.authHeader() });
if (!res.ok) return;
const body = await res.json();
this.alerts = body.items || [];
this.alertsTotal = body.total || 0;
} catch {}
},
async updateAlertStatus(alertId, status) {
try {
const res = await fetch(`/api/alerts/${alertId}/status`, {
method: 'PATCH',
headers: { 'Content-Type': 'application/json', ...this.authHeader() },
body: JSON.stringify({ status }),
});
if (res.ok) {
await this.loadAlerts();
await this.loadAlertSummary();
}
} catch {}
},
async loadRules() {
try {
const res = await fetch('/api/rules', { headers: this.authHeader() });
if (!res.ok) return;
this.rules = await res.json();
} catch {}
},
openRuleEditor(rule) {
if (rule) {
this.ruleEditId = rule.id;
this.ruleEdit = {
name: rule.name,
enabled: rule.enabled,
severity: rule.severity,
message: rule.message,
conditions: JSON.parse(JSON.stringify(rule.conditions)),
};
} else {
this.ruleEditId = null;
this.ruleEdit = { name: '', enabled: true, severity: 'medium', message: '', conditions: [] };
}
this.ruleModalOpen = true;
},
async saveRule() {
const payload = { ...this.ruleEdit };
try {
const url = this.ruleEditId ? `/api/rules/${this.ruleEditId}` : '/api/rules';
const method = this.ruleEditId ? 'PUT' : 'POST';
const res = await fetch(url, {
method,
headers: { 'Content-Type': 'application/json', ...this.authHeader() },
body: JSON.stringify(payload),
});
if (!res.ok) throw new Error(await res.text());
this.ruleModalOpen = false;
await this.loadRules();
} catch (err) {
alert('Failed to save rule: ' + err.message);
}
},
async toggleRule(ruleId, enabled) {
try {
const rule = this.rules.find((r) => r.id === ruleId);
if (!rule) return;
const res = await fetch(`/api/rules/${ruleId}`, {
method: 'PUT',
headers: { 'Content-Type': 'application/json', ...this.authHeader() },
body: JSON.stringify({ ...rule, enabled }),
});
if (res.ok) await this.loadRules();
} catch {}
},
async deleteRule(ruleId) {
if (!confirm('Delete this rule?')) return;
try {
const res = await fetch(`/api/rules/${ruleId}`, {
method: 'DELETE',
headers: this.authHeader(),
});
if (res.ok) await this.loadRules();
} catch {}
},
async askQuestion() {
const q = this.askQuestionText.trim();
if (!q) return;


@@ -28,7 +28,115 @@ body {
.page {
max-width: 1100px;
margin: 0 auto;
padding: 32px 20px 60px;
padding: 0 20px 40px;
display: flex;
flex-direction: column;
min-height: 100vh;
}
.topbar {
display: flex;
align-items: center;
gap: 16px;
padding: 12px 0;
margin-bottom: 8px;
border-bottom: 1px solid var(--border);
flex-wrap: wrap;
}
.topbar__brand {
display: flex;
align-items: center;
gap: 8px;
font-weight: 700;
font-size: 16px;
}
.topbar__logo {
font-size: 20px;
}
.topbar__links {
display: flex;
gap: 16px;
margin-right: auto;
}
.topbar__links a {
color: var(--muted);
font-size: 13px;
text-decoration: none;
font-weight: 500;
transition: color 0.15s ease;
}
.topbar__links a:hover {
color: var(--accent-strong);
}
.topbar__meta {
display: flex;
align-items: center;
gap: 10px;
}
.user-chip {
display: flex;
align-items: center;
gap: 8px;
background: rgba(255, 255, 255, 0.04);
border: 1px solid var(--border);
border-radius: 999px;
padding: 4px 12px 4px 4px;
}
.user-avatar {
width: 26px;
height: 26px;
border-radius: 50%;
background: linear-gradient(135deg, var(--accent), var(--accent-strong));
color: #0b1220;
font-size: 12px;
font-weight: 700;
display: flex;
align-items: center;
justify-content: center;
flex-shrink: 0;
}
.user-details {
display: flex;
flex-direction: column;
line-height: 1.2;
}
.user-name {
font-size: 12px;
font-weight: 600;
color: var(--text);
}
.user-email {
font-size: 11px;
color: var(--muted);
}
.login-hint {
font-size: 12px;
color: var(--muted);
font-style: italic;
}
.topbar__actions {
display: flex;
gap: 8px;
align-items: center;
}
.btn--compact {
padding: 8px 14px;
font-size: 13px;
border-radius: 8px;
}
.hero {
@@ -37,6 +145,7 @@ body {
justify-content: space-between;
gap: 16px;
margin-bottom: 20px;
padding-top: 16px;
}
.eyebrow {
@@ -165,6 +274,31 @@ input {
margin-bottom: 8px;
}
.panel-header--collapsible {
cursor: pointer;
user-select: none;
padding: 4px 0;
margin-bottom: 0;
}
.panel-header--collapsible:hover {
opacity: 0.85;
}
.panel-toggle {
display: inline-block;
font-size: 14px;
color: var(--muted);
transition: transform 0.2s ease;
transform: rotate(-90deg);
width: 16px;
text-align: center;
}
.panel-toggle--open {
transform: rotate(0deg);
}
#count {
color: var(--muted);
font-size: 14px;
@@ -246,6 +380,27 @@ input {
border-color: rgba(239, 68, 68, 0.5);
}
.pill--clickable {
cursor: pointer;
transition: transform 0.1s ease, box-shadow 0.15s ease, background 0.15s ease;
}
.pill--clickable:hover {
transform: translateY(-1px);
box-shadow: 0 2px 8px rgba(125, 211, 252, 0.2);
background: rgba(125, 211, 252, 0.2);
}
.pill--clickable.pill--ok:hover {
box-shadow: 0 2px 8px rgba(34, 197, 94, 0.2);
background: rgba(34, 197, 94, 0.25);
}
.pill--clickable.pill--warn:hover {
box-shadow: 0 2px 8px rgba(249, 115, 22, 0.2);
background: rgba(249, 115, 22, 0.25);
}
.event h3 {
margin: 0 0 6px;
font-size: 17px;
@@ -370,6 +525,14 @@ input {
align-items: center;
}
.saved-searches {
display: flex;
flex-wrap: wrap;
gap: 8px;
align-items: center;
font-size: 13px;
}
.modal__explanation {
background: rgba(255, 255, 255, 0.03);
border: 1px solid var(--border);
@@ -500,7 +663,321 @@ input {
gap: 4px;
}
.footer {
margin-top: auto;
padding: 20px 0;
border-top: 1px solid var(--border);
display: flex;
align-items: center;
justify-content: space-between;
gap: 16px;
flex-wrap: wrap;
font-size: 13px;
color: var(--muted);
}
.footer__left {
display: flex;
align-items: center;
gap: 10px;
}
.footer__brand {
font-weight: 600;
color: var(--text);
}
.footer__version {
font-size: 11px;
padding: 2px 8px;
border-radius: 999px;
background: rgba(125, 211, 252, 0.1);
border: 1px solid rgba(125, 211, 252, 0.2);
color: var(--accent-strong);
}
.footer__center {
display: flex;
gap: 16px;
align-items: center;
}
.footer__center a {
color: var(--muted);
text-decoration: none;
transition: color 0.15s ease;
}
.footer__center a:hover {
color: var(--accent-strong);
}
.footer__right {
font-size: 12px;
}
/* Alert summary in hero */
.alert-summary {
display: flex;
align-items: center;
gap: 6px;
background: rgba(255, 255, 255, 0.04);
border: 1px solid var(--border);
border-radius: 999px;
padding: 6px 14px;
}
.alert-badge {
min-width: 22px;
height: 22px;
border-radius: 999px;
display: flex;
align-items: center;
justify-content: center;
font-size: 11px;
font-weight: 700;
color: #0b1220;
}
.alert-badge--high {
background: #ef4444;
}
.alert-badge--medium {
background: #f97316;
}
.alert-badge--low {
background: #3b82f6;
}
.alert-label {
font-size: 12px;
color: var(--muted);
}
.alert-open-count {
font-size: 13px;
color: var(--muted);
}
.alert-filters {
display: flex;
gap: 10px;
margin-bottom: 12px;
}
.alert-filters select {
padding: 8px 12px;
border-radius: 8px;
border: 1px solid var(--border);
background: rgba(255, 255, 255, 0.02);
color: var(--text);
font-size: 13px;
}
.alerts-list {
display: flex;
flex-direction: column;
gap: 10px;
}
.alert-card {
border: 1px solid var(--border);
border-radius: 12px;
padding: 12px 14px;
background: rgba(255, 255, 255, 0.02);
border-left: 3px solid transparent;
}
.alert-card--high {
border-left-color: #ef4444;
}
.alert-card--medium {
border-left-color: #f97316;
}
.alert-card--low {
border-left-color: #3b82f6;
}
.alert-card__meta {
display: flex;
gap: 8px;
align-items: center;
margin-bottom: 6px;
flex-wrap: wrap;
}
.alert-card__meta small {
color: var(--muted);
font-size: 12px;
}
.alert-card strong {
font-size: 14px;
display: block;
margin-bottom: 4px;
}
.alert-card p {
margin: 0 0 10px;
font-size: 13px;
color: var(--muted);
line-height: 1.45;
}
.alert-card__actions {
display: flex;
gap: 8px;
flex-wrap: wrap;
}
.alerts-empty {
padding: 20px;
text-align: center;
color: var(--muted);
font-size: 14px;
border: 1px dashed var(--border);
border-radius: 10px;
}
/* Rules management */
.rules-list {
display: flex;
flex-direction: column;
gap: 10px;
}
.rule-card {
border: 1px solid var(--border);
border-radius: 12px;
padding: 12px 14px;
background: rgba(255, 255, 255, 0.02);
}
.rule-card--disabled {
opacity: 0.6;
}
.rule-card__meta {
display: flex;
gap: 8px;
align-items: center;
margin-bottom: 6px;
flex-wrap: wrap;
}
.toggle-label {
display: flex;
align-items: center;
gap: 6px;
font-size: 12px;
color: var(--muted);
cursor: pointer;
}
.toggle-label input[type="checkbox"] {
width: 14px;
height: 14px;
accent-color: var(--accent-strong);
}
.rule-card strong {
font-size: 14px;
display: block;
margin-bottom: 4px;
}
.rule-card p {
margin: 0 0 8px;
font-size: 13px;
color: var(--muted);
line-height: 1.4;
}
.rule-card__conditions {
display: flex;
flex-wrap: wrap;
gap: 6px;
margin-bottom: 10px;
}
.rule-card__actions {
display: flex;
gap: 8px;
}
.rules-empty {
padding: 20px;
text-align: center;
color: var(--muted);
font-size: 14px;
border: 1px dashed var(--border);
border-radius: 10px;
}
.rule-form {
display: flex;
flex-direction: column;
gap: 14px;
}
.rule-form label {
display: flex;
flex-direction: column;
gap: 6px;
font-size: 14px;
color: var(--muted);
}
.rule-form input,
.rule-form select,
.rule-form textarea {
padding: 10px 12px;
border-radius: 10px;
border: 1px solid var(--border);
background: rgba(255, 255, 255, 0.02);
color: var(--text);
font-size: 14px;
}
.rule-conditions {
display: flex;
flex-direction: column;
gap: 10px;
}
.condition-row {
display: flex;
gap: 8px;
align-items: center;
}
.condition-row input,
.condition-row select {
flex: 1;
min-width: 0;
}
.rule-form__actions {
display: flex;
gap: 10px;
margin-top: 8px;
}
@media (max-width: 640px) {
.topbar {
flex-direction: column;
align-items: flex-start;
gap: 10px;
}
.topbar__links {
margin-right: 0;
}
.hero {
flex-direction: column;
}
@@ -514,4 +991,10 @@ input {
flex-direction: column;
align-items: stretch;
}
.footer {
flex-direction: column;
text-align: center;
gap: 10px;
}
}

backend/jobs.py (new file, 117 lines)

@@ -0,0 +1,117 @@
"""arq job functions for async LLM processing."""
import hashlib
import json
import structlog
from arq.connections import RedisSettings
from config import REDIS_URL
logger = structlog.get_logger("aoc.jobs")
# ---------------------------------------------------------------------------
# Cache helpers
# ---------------------------------------------------------------------------
CACHE_TTL_ASK = 3600 # 1 hour
CACHE_TTL_EXPLAIN = 86400 # 24 hours
def _ask_cache_key(question: str, filters: dict, events: list) -> str:
payload = json.dumps({"q": question, "f": filters, "e": [e.get("id") for e in events]}, sort_keys=True)
return f"aoc:cache:ask:{hashlib.md5(payload.encode()).hexdigest()}"
def _explain_cache_key(event_id: str) -> str:
return f"aoc:cache:explain:{event_id}"
async def get_cached_ask(redis, question: str, filters: dict, events: list) -> dict | None:
key = _ask_cache_key(question, filters, events)
raw = await redis.get(key)
if raw:
return json.loads(raw)
return None
async def set_cached_ask(redis, question: str, filters: dict, events: list, result: dict):
key = _ask_cache_key(question, filters, events)
await redis.setex(key, CACHE_TTL_ASK, json.dumps(result, default=str))
async def get_cached_explain(redis, event_id: str) -> dict | None:
key = _explain_cache_key(event_id)
raw = await redis.get(key)
if raw:
return json.loads(raw)
return None
async def set_cached_explain(redis, event_id: str, result: dict):
key = _explain_cache_key(event_id)
await redis.setex(key, CACHE_TTL_EXPLAIN, json.dumps(result, default=str))
# ---------------------------------------------------------------------------
# arq job functions
# ---------------------------------------------------------------------------
async def process_ask_question(
ctx, question: str, filters: dict, events: list, total: int, excluded_services: list | None
):
"""Background job: call LLM for /api/ask and cache result."""
from routes.ask import _call_llm
redis = ctx["redis"]
try:
answer = await _call_llm(question, events, total=total, excluded_services=excluded_services)
result = {"status": "completed", "answer": answer, "llm_used": True, "llm_error": None}
except Exception as exc:
logger.warning("Async ask LLM failed", error=str(exc))
result = {"status": "failed", "answer": "", "llm_used": False, "llm_error": str(exc)}
await set_cached_ask(redis, question, filters, events, result)
return result
async def process_explain_event(ctx, event_id: str, event: dict, related: list):
"""Background job: call LLM for /api/events/{id}/explain and cache result."""
from routes.ask import _explain_event
redis = ctx["redis"]
try:
explanation = await _explain_event(event, related)
result = {"status": "completed", "explanation": explanation, "llm_used": True, "llm_error": None}
except Exception as exc:
logger.warning("Async explain LLM failed", error=str(exc))
result = {"status": "failed", "explanation": "", "llm_used": False, "llm_error": str(exc)}
await set_cached_explain(redis, event_id, result)
return result
# ---------------------------------------------------------------------------
# arq worker configuration
# ---------------------------------------------------------------------------
async def startup(ctx):
from redis.asyncio import Redis
ctx["redis"] = Redis.from_url(REDIS_URL, decode_responses=True)
async def shutdown(ctx):
await ctx["redis"].close()
class WorkerSettings:
functions = [process_ask_question, process_explain_event]
redis_settings = RedisSettings.from_dsn(REDIS_URL)
on_startup = startup
on_shutdown = shutdown
max_jobs = 10
job_timeout = 120
keep_result = 3600
keep_result_forever = False
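Presumably the docker-compose worker service mentioned in the v1.6.0 commit wraps arq's CLI around this settings class, something like `arq jobs.WorkerSettings` run from the backend directory; the exact command is not shown in this diff.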


@@ -14,12 +14,15 @@ from fastapi.responses import Response
from fastapi.staticfiles import StaticFiles
from metrics import observe_request, prometheus_metrics
from middleware import CorrelationIdMiddleware
from routes.alerts import router as alerts_router
from routes.config import router as config_router
from routes.events import router as events_router
from routes.fetch import router as fetch_router
from routes.fetch import run_fetch
from routes.health import router as health_router
from routes.jobs import router as jobs_router
from routes.rules import router as rules_router
from routes.saved_searches import router as saved_searches_router
from routes.webhooks import router as webhooks_router
@@ -119,7 +122,10 @@ if AI_FEATURES_ENABLED:
from routes.mcp import mcp_asgi
app.mount("/mcp", mcp_asgi)
app.include_router(saved_searches_router, prefix="/api")
app.include_router(rules_router, prefix="/api")
app.include_router(alerts_router, prefix="/api")
app.include_router(jobs_router, prefix="/api")
@app.get("/health")
@@ -163,6 +169,9 @@ async def _periodic_fetch():
@app.on_event("startup")
async def start_periodic_fetch():
setup_indexes()
from rules import seed_default_rules
seed_default_rules()
if ENABLE_PERIODIC_FETCH:
app.state.fetch_task = asyncio.create_task(_periodic_fetch())
@@ -174,3 +183,6 @@ async def stop_periodic_fetch():
task.cancel()
with suppress(Exception):
await task
from redis_client import close_redis_connections
await close_redis_connections()


@@ -82,6 +82,7 @@ class AskRequest(BaseModel):
end: str | None = None
include_tags: list[str] | None = None
exclude_tags: list[str] | None = None
async_mode: bool = False # enqueue async job instead of waiting
class AskEventRef(BaseModel):
@@ -101,3 +102,4 @@ class AskResponse(BaseModel):
query_info: dict
llm_used: bool
llm_error: str | None = None
job_id: str | None = None

backend/notifications.py (new file, 172 lines)

@@ -0,0 +1,172 @@
"""Pluggable notification channels for admin-ops alerts.
Supported channels:
- webhook: POST JSON to any URL (Slack, Teams, generic)
"""
from datetime import UTC, datetime
import requests
import structlog
from tenacity import retry, retry_if_exception_type, stop_after_attempt, wait_exponential
logger = structlog.get_logger("aoc.notifications")
WEBHOOK_TIMEOUT = 15
@retry(
stop=stop_after_attempt(3),
wait=wait_exponential(multiplier=1, min=2, max=10),
retry=retry_if_exception_type((requests.ConnectionError, requests.Timeout)),
reraise=True,
)
def _post_webhook(url: str, payload: dict) -> requests.Response:
"""POST to webhook with retry on connection/timeout errors."""
return requests.post(url, json=payload, timeout=WEBHOOK_TIMEOUT, headers={"Content-Type": "application/json"})
def _build_slack_payload(rule_name: str, severity: str, message: str, event: dict) -> dict:
"""Build a Slack-compatible block payload."""
color = {"high": "#ef4444", "medium": "#f97316", "low": "#3b82f6"}.get(severity, "#94a3b8")
ts = event.get("timestamp", "?")
op = event.get("operation", "unknown")
actor = event.get("actor_display", "unknown")
targets = ", ".join(event.get("target_displays", [])) or ""
svc = event.get("service", "unknown")
return {
"text": f"[{severity.upper()}] {rule_name}: {message}",
"attachments": [
{
"color": color,
"fields": [
{"title": "Rule", "value": rule_name, "short": True},
{"title": "Severity", "value": severity.upper(), "short": True},
{"title": "Service", "value": svc, "short": True},
{"title": "Action", "value": op, "short": True},
{"title": "Actor", "value": actor, "short": True},
{"title": "Target", "value": targets, "short": True},
{"title": "Time", "value": ts, "short": False},
],
"footer": "AOC Admin Operations Center",
}
],
}
def _build_teams_payload(rule_name: str, severity: str, message: str, event: dict) -> dict:
"""Build a Microsoft Teams adaptive card payload."""
color = {"high": "Attention", "medium": "Warning", "low": "Good"}.get(severity, "Default")
ts = event.get("timestamp", "?")
op = event.get("operation", "unknown")
actor = event.get("actor_display", "unknown")
targets = ", ".join(event.get("target_displays", [])) or ""
svc = event.get("service", "unknown")
return {
"type": "message",
"attachments": [
{
"contentType": "application/vnd.microsoft.card.adaptive",
"content": {
"$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
"type": "AdaptiveCard",
"version": "1.4",
"body": [
{
"type": "TextBlock",
"text": f"🚨 {severity.upper()}: {rule_name}",
"weight": "Bolder",
"size": "Medium",
"color": color,
},
{"type": "TextBlock", "text": message, "wrap": True},
{
"type": "FactSet",
"facts": [
{"title": "Service:", "value": svc},
{"title": "Action:", "value": op},
{"title": "Actor:", "value": actor},
{"title": "Target:", "value": targets},
{"title": "Time:", "value": ts},
],
},
],
},
}
],
}
def _build_generic_payload(rule_name: str, severity: str, message: str, event: dict) -> dict:
"""Build a generic JSON payload."""
return {
"alert": {
"rule_name": rule_name,
"severity": severity,
"message": message,
"timestamp": datetime.now(UTC).isoformat(),
},
"event": {
"id": event.get("id"),
"timestamp": event.get("timestamp"),
"service": event.get("service"),
"operation": event.get("operation"),
"actor_display": event.get("actor_display"),
"target_displays": event.get("target_displays"),
"result": event.get("result"),
},
}
def send_notification(
webhook_url: str,
format_type: str,
rule_name: str,
severity: str,
message: str,
event: dict,
) -> bool:
"""Send an alert notification to the configured channel.
Args:
webhook_url: URL to POST to.
format_type: "slack", "teams", or "generic".
rule_name: Name of the triggered rule.
severity: high, medium, or low.
message: Human-readable alert message.
event: The normalized event that triggered the alert.
Returns:
True if delivery succeeded, False otherwise.
"""
if not webhook_url:
return False
builders = {
"slack": _build_slack_payload,
"teams": _build_teams_payload,
"generic": _build_generic_payload,
}
builder = builders.get(format_type, _build_generic_payload)
payload = builder(rule_name, severity, message, event)
try:
res = _post_webhook(webhook_url, payload)
res.raise_for_status()
logger.info(
"Notification sent",
rule=rule_name,
severity=severity,
format=format_type,
status_code=res.status_code,
)
return True
except Exception as exc:
logger.warning(
"Notification failed after retries",
rule=rule_name,
severity=severity,
format=format_type,
error=str(exc),
)
return False
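As a usage sketch with made-up values (the webhook URL and event fields are placeholders), a test alert against a Slack-compatible endpoint would look like:

from notifications import send_notification

ok = send_notification(
    webhook_url="https://hooks.slack.com/services/T000/B000/XXXX",  # placeholder
    format_type="slack",
    rule_name="Admin Role Assignment",
    severity="high",
    message="A user was assigned an administrative role.",
    event={
        "timestamp": "2026-04-22T12:00:00Z",
        "service": "RoleManagement",
        "operation": "Add member to role",
        "actor_display": "alice@contoso.com",
        "target_displays": ["Global Administrator"],
    },
)
# ok is False when no URL is configured or after three failed delivery attempts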

backend/redis_client.py Normal file
View File

@@ -0,0 +1,36 @@
"""Async Redis client singleton for caching and job queue."""
import redis.asyncio as aioredis
from arq import create_pool
from arq.connections import ArqRedis, RedisSettings
from config import REDIS_URL
_arq_pool: ArqRedis | None = None
_plain_redis: aioredis.Redis | None = None
async def get_arq_pool() -> ArqRedis:
"""Return a shared arq pool (ArqRedis extends redis.asyncio.Redis)."""
global _arq_pool
if _arq_pool is None:
_arq_pool = await create_pool(RedisSettings.from_dsn(REDIS_URL))
return _arq_pool
async def get_redis() -> aioredis.Redis:
"""Return a shared plain async Redis client."""
global _plain_redis
if _plain_redis is None:
_plain_redis = aioredis.from_url(REDIS_URL, decode_responses=True)
return _plain_redis
async def close_redis_connections():
"""Close all Redis connections (call on shutdown)."""
global _arq_pool, _plain_redis
if _arq_pool:
await _arq_pool.close()
_arq_pool = None
if _plain_redis:
await _plain_redis.close()
_plain_redis = None
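Both getters memoise their client, so repeated awaits return the same connection; a quick sketch (assumes a reachable Redis at REDIS_URL):

import asyncio

from redis_client import close_redis_connections, get_redis

async def demo() -> None:
    r1 = await get_redis()
    r2 = await get_redis()
    assert r1 is r2  # the module holds a single shared client
    await r1.set("aoc:demo", "1")
    await close_redis_connections()

asyncio.run(demo())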

View File

@@ -14,3 +14,5 @@ prometheus-client
httpx
gunicorn
mcp
redis
arq

backend/routes/alerts.py Normal file
View File

@@ -0,0 +1,78 @@
"""Alert management endpoints."""
from auth import require_auth
from bson import ObjectId
from database import alerts_collection
from fastapi import APIRouter, Depends, HTTPException, Query
from pydantic import BaseModel
router = APIRouter(dependencies=[Depends(require_auth)])
class AlertStatusUpdate(BaseModel):
status: str # open | acknowledged | resolved | false_positive
class AlertListResponse(BaseModel):
items: list[dict]
total: int
@router.get("/alerts", response_model=AlertListResponse)
def list_alerts(
status: str = Query(default="", description="Filter by status"),
severity: str = Query(default="", description="Filter by severity"),
rule_name: str = Query(default="", description="Filter by rule name"),
page_size: int = Query(default=50, ge=1, le=200),
page: int = Query(default=1, ge=1),
):
query = {}
if status:
query["status"] = status
if severity:
query["severity"] = severity
if rule_name:
query["rule_name"] = {"$regex": rule_name, "$options": "i"}
total = alerts_collection.count_documents(query)
skip = (page - 1) * page_size
cursor = alerts_collection.find(query, {"_id": 0}).sort("timestamp", -1).skip(skip).limit(page_size)
return {"items": list(cursor), "total": total}
@router.patch("/alerts/{alert_id}/status")
def update_alert_status(alert_id: str, body: AlertStatusUpdate):
result = alerts_collection.update_one(
{"_id": ObjectId(alert_id)},
{"$set": {"status": body.status}},
)
if result.matched_count == 0:
raise HTTPException(status_code=404, detail="Alert not found")
return {"updated": True, "status": body.status}
@router.get("/alerts/summary")
def alert_summary():
"""Return counts by status and severity for the dashboard."""
pipeline = [
{
"$group": {
"_id": {"status": "$status", "severity": "$severity"},
"count": {"$sum": 1},
}
}
]
by_status_severity = list(alerts_collection.aggregate(pipeline))
total_open = alerts_collection.count_documents({"status": "open"})
total_acknowledged = alerts_collection.count_documents({"status": "acknowledged"})
total_resolved = alerts_collection.count_documents({"status": "resolved"})
total_false_positive = alerts_collection.count_documents({"status": "false_positive"})
return {
"total_open": total_open,
"total_acknowledged": total_acknowledged,
"total_resolved": total_resolved,
"total_false_positive": total_false_positive,
"by_status_severity": by_status_severity,
}
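A client-side sketch of these endpoints (placeholder host and alert id; assumes auth is disabled or a bearer token is added):

import httpx

BASE = "http://localhost:8000/api"  # placeholder

# Newest open high-severity alerts, first page
resp = httpx.get(f"{BASE}/alerts", params={"status": "open", "severity": "high"})
resp.raise_for_status()
print(resp.json()["total"])

# Acknowledge a single alert by its ObjectId string
alert_id = "<alert-object-id>"  # placeholder
httpx.patch(f"{BASE}/alerts/{alert_id}/status", json={"status": "acknowledged"})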

View File

@@ -1,14 +1,26 @@
import asyncio
import json
import re
from datetime import UTC, datetime, timedelta
import httpx
import structlog
-from auth import require_auth
-from config import LLM_API_KEY, LLM_API_VERSION, LLM_BASE_URL, LLM_MAX_EVENTS, LLM_MODEL, LLM_TIMEOUT_SECONDS
from auth import require_auth, user_can_access_privacy_services
from config import (
LLM_API_KEY,
LLM_API_VERSION,
LLM_BASE_URL,
LLM_MAX_EVENTS,
LLM_MODEL,
LLM_TIMEOUT_SECONDS,
PRIVACY_SENSITIVE_OPERATIONS,
PRIVACY_SERVICES,
)
from database import events_collection
from fastapi import APIRouter, Depends, HTTPException
from jobs import get_cached_ask, get_cached_explain, set_cached_ask, set_cached_explain
from models.api import AskRequest, AskResponse
from redis_client import get_arq_pool
router = APIRouter(dependencies=[Depends(require_auth)])
logger = structlog.get_logger("aoc.ask")
@@ -49,7 +61,7 @@ _SERVICE_INTENTS = {
# Services that are extremely noisy for typical admin questions.
# We exclude them by default on broad questions unless the user explicitly mentions them.
_NOISY_SERVICES = {"Exchange", "SharePoint"}
_NOISY_SERVICES = {"Exchange", "SharePoint", "Teams"}
# Services that are generally admin-relevant and kept by default.
_DEFAULT_ADMIN_SERVICES = {
@@ -471,11 +483,73 @@ Do not invent facts that are not in the data.
"""
_GUID_RE = re.compile(r"^[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}$")
def _extract_guids(obj: dict | list | str) -> set[str]:
"""Recursively extract UUID-like strings from a JSON structure."""
guids = set()
if isinstance(obj, dict):
for k, v in obj.items():
if k.lower() in ("id", "groupid", "userid", "targetid") and isinstance(v, str) and _GUID_RE.match(v):
guids.add(v)
guids.update(_extract_guids(v))
elif isinstance(obj, list):
for item in obj:
guids.update(_extract_guids(item))
elif isinstance(obj, str) and _GUID_RE.match(obj):
guids.add(obj)
return guids
async def _resolve_guids_for_event(event: dict) -> dict[str, str]:
"""Try to resolve GUIDs in an event to human-readable names via Graph API."""
raw = event.get("raw") or {}
guids = _extract_guids(raw)
# Also include any GUIDs in targetResources that might not have displayName
for tr in raw.get("targetResources") or []:
tid = tr.get("id")
if tid and _GUID_RE.match(tid):
guids.add(tid)
for tr in raw.get("modifiedProperties") or []:
for key in ("oldValue", "newValue"):
val = tr.get(key)
if val and _GUID_RE.match(val):
guids.add(val)
if not guids:
return {}
try:
from graph.auth import get_access_token
from graph.resolve import resolve_directory_object
token = await asyncio.to_thread(get_access_token)
cache: dict[str, dict] = {}
resolved = {}
for gid in guids:
result = await asyncio.to_thread(resolve_directory_object, gid, token, cache)
if result:
resolved[gid] = result["name"]
return resolved
except Exception as exc:
logger.warning("GUID resolution failed", error=str(exc))
return {}
async def _explain_event(event: dict, related: list[dict]) -> str:
if not LLM_API_KEY:
raise RuntimeError("LLM_API_KEY not configured")
# Resolve GUIDs to names before sending to LLM
resolved = await _resolve_guids_for_event(event)
event_text = json.dumps(event, indent=2, default=str)
resolution_text = ""
if resolved:
resolution_text = "\nResolved GUIDs:\n"
for gid, name in resolved.items():
resolution_text += f" {gid}{name}\n"
related_text = ""
if related:
@@ -492,7 +566,7 @@ async def _explain_event(event: dict, related: list[dict]) -> str:
{"role": "system", "content": _EXPLAIN_SYSTEM_PROMPT},
{
"role": "user",
"content": f"Audit event:\n{event_text}{related_text}\n\nPlease explain this event.",
"content": f"Audit event:\n{event_text}{resolution_text}{related_text}\n\nPlease explain this event.",
},
]
@@ -525,6 +599,11 @@ async def explain_event(event_id: str, user: dict = Depends(require_auth)):
if not event:
raise HTTPException(status_code=404, detail="Event not found")
if (
event.get("service") in PRIVACY_SERVICES or event.get("operation") in PRIVACY_SENSITIVE_OPERATIONS
) and not user_can_access_privacy_services(user):
raise HTTPException(status_code=403, detail="Access to this event is restricted")
event.pop("_id", None)
# Fetch related events for context (same actor or target in last 24h)
@@ -563,14 +642,23 @@ async def explain_event(event_id: str, user: dict = Depends(require_auth)):
"llm_error": "LLM_API_KEY not configured",
}
# Check cache first
redis = await get_arq_pool()
cached = await get_cached_explain(redis, event_id)
if cached:
cached["related_count"] = len(related)
return cached
try:
explanation = await _explain_event(event, related)
-return {
result = {
"explanation": explanation,
"llm_used": True,
"llm_error": None,
"related_count": len(related),
}
await set_cached_explain(redis, event_id, result)
return result
except Exception as exc:
logger.warning("Event explanation failed", error=str(exc))
return {
@@ -615,6 +703,8 @@ async def ask_question(body: AskRequest, user: dict = Depends(require_auth)):
# -----------------------------------------------------------------------
# Build and run query
# -----------------------------------------------------------------------
privacy_excluded_services = [] if user_can_access_privacy_services(user) else list(PRIVACY_SERVICES)
privacy_excluded_ops = [] if user_can_access_privacy_services(user) else list(PRIVACY_SENSITIVE_OPERATIONS)
query = _build_event_query(
entity,
start,
@@ -626,6 +716,13 @@ async def ask_question(body: AskRequest, user: dict = Depends(require_auth)):
include_tags=body.include_tags,
exclude_tags=body.exclude_tags,
)
extra_filters = []
if privacy_excluded_services:
extra_filters.append({"service": {"$nin": privacy_excluded_services}})
if privacy_excluded_ops:
extra_filters.append({"operation": {"$nin": privacy_excluded_ops}})
if extra_filters:
query["$and"] = query.get("$and", []) + extra_filters
try:
total = events_collection.count_documents(query)
@@ -660,19 +757,78 @@ async def ask_question(body: AskRequest, user: dict = Depends(require_auth)):
llm_error="LLM not used — no events found." if not LLM_API_KEY else None,
)
-# Try LLM summarisation
# Try LLM summarisation (with caching + optional async)
answer = ""
llm_used = False
llm_error = None
-if not LLM_API_KEY:
-llm_error = "LLM_API_KEY is not configured. Set it in your .env to enable AI narrative summarisation."
job_id = None
filters_snapshot = {
"services": body.services,
"actor": body.actor,
"operation": body.operation,
"result": body.result,
"start": body.start,
"end": body.end,
"include_tags": body.include_tags,
"exclude_tags": body.exclude_tags,
}
if LLM_API_KEY:
redis = await get_arq_pool()
cached = await get_cached_ask(redis, question, filters_snapshot, events)
if cached:
answer = cached.get("answer", "")
llm_used = cached.get("llm_used", False)
llm_error = cached.get("llm_error")
elif body.async_mode:
pool = await get_arq_pool()
job = await pool.enqueue_job(
"process_ask_question",
question,
filters_snapshot,
events,
total,
excluded_services,
)
job_id = job.job_id if job else None
return AskResponse(
answer="Your question is being processed. Poll /api/jobs/{job_id} for the result.",
events=[_to_event_ref(e) for e in events],
query_info={
"entity": entity,
"start": start,
"end": end,
"event_count": len(events),
"total_matched": total,
"services_queried": query_services,
"excluded_services": excluded_services,
"mongo_query": json.dumps(query, default=str),
},
llm_used=False,
llm_error=None,
job_id=job_id,
)
else:
try:
answer = await _call_llm(question, events, total=total, excluded_services=excluded_services)
llm_used = True
await set_cached_ask(
redis,
question,
filters_snapshot,
events,
{
"answer": answer,
"llm_used": True,
"llm_error": None,
},
)
except Exception as exc:
llm_error = f"LLM call failed: {exc}"
logger.warning("LLM call failed, falling back to structured summary", error=str(exc))
else:
-try:
-answer = await _call_llm(question, events, total=total, excluded_services=excluded_services)
-llm_used = True
-except Exception as exc:
-llm_error = f"LLM call failed: {exc}"
-logger.warning("LLM call failed, falling back to structured summary", error=str(exc))
llm_error = "LLM_API_KEY is not configured. Set it in your .env to enable AI narrative summarisation."
# Fallback: structured summary if LLM unavailable or failed
if not answer:
@@ -711,4 +867,5 @@ async def ask_question(body: AskRequest, user: dict = Depends(require_auth)):
},
llm_used=llm_used,
llm_error=llm_error,
job_id=job_id,
)
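End to end, the new async path looks roughly like this from a client (placeholder host and question); a polling sketch follows the jobs route further down:

import httpx

resp = httpx.post(
    "http://localhost:8000/api/ask",  # placeholder
    json={"question": "What happened to USER-001?", "async_mode": True},
)
job_id = resp.json()["job_id"]  # then poll /api/jobs/{job_id}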

View File

@@ -4,6 +4,7 @@ from config import (
AUTH_ENABLED,
AUTH_SCOPE,
AUTH_TENANT_ID,
DEFAULT_PAGE_SIZE,
)
from fastapi import APIRouter
@@ -25,4 +26,5 @@ def auth_config():
def features_config():
return {
"ai_features_enabled": AI_FEATURES_ENABLED,
"default_page_size": DEFAULT_PAGE_SIZE,
}

View File

@@ -3,8 +3,9 @@ import re
from datetime import UTC, datetime
from audit_trail import log_action
-from auth import require_auth
from auth import require_auth, user_can_access_privacy_services
from bson import ObjectId
from config import PRIVACY_SENSITIVE_OPERATIONS, PRIVACY_SERVICES
from database import events_collection
from fastapi import APIRouter, Depends, HTTPException, Query
from models.api import (
@@ -44,6 +45,7 @@ def _build_query(
cursor: str | None = None,
include_tags: list[str] | None = None,
exclude_tags: list[str] | None = None,
exclude_operations: list[str] | None = None,
) -> dict:
filters = []
@@ -51,6 +53,8 @@ def _build_query(
filters.append({"service": service})
if services:
filters.append({"service": {"$in": services}})
if exclude_operations:
filters.append({"operation": {"$nin": exclude_operations}})
if actor:
actor_safe = re.escape(actor)
filters.append(
@@ -125,6 +129,8 @@ def list_events(
exclude_tags: list[str] | None = Query(default=None),
user: dict = Depends(require_auth),
):
privacy_excluded_services = [] if user_can_access_privacy_services(user) else list(PRIVACY_SERVICES)
privacy_excluded_ops = [] if user_can_access_privacy_services(user) else list(PRIVACY_SENSITIVE_OPERATIONS)
query = _build_query(
service=service,
services=services,
@@ -137,7 +143,13 @@ def list_events(
cursor=cursor,
include_tags=include_tags,
exclude_tags=exclude_tags,
exclude_operations=privacy_excluded_ops,
)
if privacy_excluded_services:
query = query if query else {}
if "$and" not in query:
query = {"$and": [query]} if query else {"$and": []}
query["$and"].append({"service": {"$nin": privacy_excluded_services}})
safe_page_size = max(1, min(page_size, 500))
@@ -202,6 +214,8 @@ def bulk_tags(
exclude_tags: list[str] | None = Query(default=None),
user: dict = Depends(require_auth),
):
privacy_excluded_services = [] if user_can_access_privacy_services(user) else list(PRIVACY_SERVICES)
privacy_excluded_ops = [] if user_can_access_privacy_services(user) else list(PRIVACY_SENSITIVE_OPERATIONS)
query = _build_query(
service=service,
services=services,
@@ -213,7 +227,13 @@ def bulk_tags(
search=search,
include_tags=include_tags,
exclude_tags=exclude_tags,
exclude_operations=privacy_excluded_ops,
)
if privacy_excluded_services:
query = query if query else {}
if "$and" not in query:
query = {"$and": [query]} if query else {"$and": []}
query["$and"].append({"service": {"$nin": privacy_excluded_services}})
tags = [t.strip() for t in body.tags if t.strip()]
if not tags:
raise HTTPException(status_code=400, detail="No tags provided")
@@ -235,7 +255,10 @@ def bulk_tags(
@router.get("/filter-options", response_model=FilterOptionsResponse)
-def filter_options(limit: int = Query(default=200, ge=1, le=1000)):
def filter_options(
limit: int = Query(default=200, ge=1, le=1000),
user: dict = Depends(require_auth),
):
safe_limit = max(1, min(limit, 1000))
try:
services = sorted(events_collection.distinct("service"))[:safe_limit]
@@ -247,6 +270,10 @@ def filter_options(limit: int = Query(default=200, ge=1, le=1000)):
except Exception as exc:
raise HTTPException(status_code=500, detail=f"Failed to load filter options: {exc}") from exc
if not user_can_access_privacy_services(user):
services = [s for s in services if s not in PRIVACY_SERVICES]
operations = [o for o in operations if o not in PRIVACY_SENSITIVE_OPERATIONS]
return {
"services": services,
"operations": operations,

backend/routes/jobs.py Normal file
View File

@@ -0,0 +1,43 @@
"""Job status endpoints for async LLM operations."""
from arq.jobs import Job, JobStatus
from auth import require_auth
from fastapi import APIRouter, Depends, HTTPException
from pydantic import BaseModel
from redis_client import get_redis
router = APIRouter(dependencies=[Depends(require_auth)])
class JobStatusResponse(BaseModel):
job_id: str
status: str # queued, in_progress, complete, not_found, deferred
result: dict | None = None
error: str | None = None
@router.get("/jobs/{job_id}", response_model=JobStatusResponse)
async def get_job_status(job_id: str, user: dict = Depends(require_auth)):
"""Poll for the result of an async LLM job."""
redis = await get_redis()
job = Job(job_id, redis)
status = await job.status()
if status == JobStatus.not_found:
raise HTTPException(status_code=404, detail="Job not found")
result = None
error = None
if status == JobStatus.complete:
try:
result_data = await job.result(timeout=0)
result = result_data if isinstance(result_data, dict) else {"data": str(result_data)}
except Exception as exc:
error = str(exc)
return JobStatusResponse(
job_id=job_id,
status=status.value,
result=result,
error=error,
)
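A polling sketch for this endpoint (placeholder host; interval and timeout are arbitrary):

import time

import httpx

def wait_for_job(job_id: str, timeout: float = 120.0) -> dict:
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        resp = httpx.get(f"http://localhost:8000/api/jobs/{job_id}")
        if resp.status_code == 404:
            raise RuntimeError("job not found or result expired")
        data = resp.json()
        if data["status"] == "complete":
            return data["result"] or {}
        time.sleep(2)  # job still queued or in progress
    raise TimeoutError(f"job {job_id} did not finish within {timeout}s")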

View File

@@ -0,0 +1,60 @@
"""CRUD for saved filter searches (bookmarks)."""
import uuid
from datetime import UTC, datetime
import structlog
from auth import require_auth
from database import saved_searches_collection
from fastapi import APIRouter, Depends, HTTPException
router = APIRouter(dependencies=[Depends(require_auth)])
logger = structlog.get_logger("aoc.saved_searches")
def _user_sub(user: dict) -> str:
return user.get("sub", "anonymous")
@router.get("/saved-searches")
async def list_saved_searches(user: dict = Depends(require_auth)):
"""Return saved searches for the current user."""
sub = _user_sub(user)
cursor = saved_searches_collection.find({"created_by": sub}).sort("created_at", -1)
items = []
for doc in cursor:
doc["id"] = doc.pop("_id")
items.append(doc)
return items
@router.post("/saved-searches")
async def create_saved_search(body: dict, user: dict = Depends(require_auth)):
"""Save the current filter set."""
name = (body.get("name") or "").strip()
if not name:
raise HTTPException(status_code=400, detail="Name is required")
filters = body.get("filters") or {}
doc = {
"_id": str(uuid.uuid4()),
"name": name,
"filters": filters,
"created_at": datetime.now(UTC).isoformat().replace("+00:00", "Z"),
"created_by": _user_sub(user),
}
saved_searches_collection.insert_one(doc)
logger.info("Saved search created", name=name, user=doc["created_by"])
doc["id"] = doc.pop("_id")
return doc
@router.delete("/saved-searches/{search_id}")
async def delete_saved_search(search_id: str, user: dict = Depends(require_auth)):
"""Delete a saved search (only if owned by current user)."""
sub = _user_sub(user)
result = saved_searches_collection.delete_one({"_id": search_id, "created_by": sub})
if result.deleted_count == 0:
raise HTTPException(status_code=404, detail="Saved search not found")
logger.info("Saved search deleted", search_id=search_id, user=sub)
return {"status": "deleted"}

View File

@@ -1,7 +1,18 @@
-from datetime import UTC, datetime
"""Rule-based alerting for admin operations.
Rules are evaluated during event ingestion. Triggered alerts are stored in MongoDB
and optionally forwarded to a notification channel (webhook, Slack, Teams).
Deduplication: the same rule firing for the same actor within ALERT_DEDUPE_MINUTES
produces only one alert.
"""
from datetime import UTC, datetime, timedelta
import structlog
from config import ALERT_DEDUPE_MINUTES, ALERT_WEBHOOK_FORMAT, ALERT_WEBHOOK_URL
from database import db
from pymongo import ASCENDING
logger = structlog.get_logger("aoc.rules")
rules_collection = db["alert_rules"]
@@ -18,6 +29,13 @@ def evaluate_event(event: dict) -> list[dict]:
rules = load_rules()
for rule in rules:
if _matches(rule, event):
if _is_duplicate(rule, event):
logger.debug(
"Alert deduplicated",
rule=rule.get("name"),
event_id=event.get("id"),
)
continue
triggered.append(rule)
_create_alert(rule, event)
return triggered
@@ -50,6 +68,9 @@ def _matches(rule: dict, event: dict) -> bool:
return False
except Exception:
return False
if op == "threshold_count":
# Threshold rules are evaluated at query time, not per-event
return False
return True
@@ -64,7 +85,22 @@ def _get_nested(obj: dict, path: str):
return val
def _is_duplicate(rule: dict, event: dict) -> bool:
"""Check if an alert for this rule + actor was recently created."""
if ALERT_DEDUPE_MINUTES <= 0:
return False
cutoff = (datetime.now(UTC) - timedelta(minutes=ALERT_DEDUPE_MINUTES)).isoformat()
actor = event.get("actor_display") or event.get("actor_upn") or "unknown"
query = {
"rule_id": str(rule.get("_id")),
"actor": actor,
"timestamp": {"$gte": cutoff},
}
return alerts_collection.count_documents(query, limit=1) > 0
def _create_alert(rule: dict, event: dict):
actor = event.get("actor_display") or event.get("actor_upn") or "unknown"
alert = {
"timestamp": datetime.now(UTC).isoformat(),
"rule_id": str(rule.get("_id")),
@@ -72,10 +108,177 @@ def _create_alert(rule: dict, event: dict):
"severity": rule.get("severity", "medium"),
"event_id": event.get("id"),
"event_dedupe_key": event.get("dedupe_key"),
"actor": actor,
"message": rule.get("message", f"Rule '{rule.get('name')}' triggered"),
"status": "open", # open | acknowledged | resolved | false_positive
}
try:
alerts_collection.insert_one(alert)
logger.info("Alert created", rule=rule.get("name"), event_id=event.get("id"))
except Exception as exc:
logger.warning("Failed to create alert", error=str(exc))
return
# Send notification
if ALERT_WEBHOOK_URL:
try:
from notifications import send_notification
send_notification(
webhook_url=ALERT_WEBHOOK_URL,
format_type=ALERT_WEBHOOK_FORMAT,
rule_name=rule.get("name", "Unnamed rule"),
severity=rule.get("severity", "medium"),
message=rule.get("message", ""),
event=event,
)
except Exception as exc:
logger.warning("Failed to send notification", error=str(exc))
def seed_default_rules():
"""Upsert pre-built admin-ops rule templates. Safe for concurrent startup."""
# One-time cleanup: remove duplicates by name, keep the oldest (_id ascending)
pipeline = [
{"$sort": {"_id": ASCENDING}},
{"$group": {"_id": "$name", "first_id": {"$first": "$_id"}}},
]
seen = {doc["_id"]: doc["first_id"] for doc in rules_collection.aggregate(pipeline)}
for name, keep_id in seen.items():
rules_collection.delete_many({"name": name, "_id": {"$ne": keep_id}})
defaults = [
{
"name": "Failed Conditional Access",
"enabled": True,
"severity": "high",
"message": (
"A Conditional Access policy evaluation failed. "
"This may indicate a sign-in risk or policy misconfiguration."
),
"conditions": [
{"field": "service", "op": "eq", "value": "Directory"},
{"field": "operation", "op": "contains", "value": "ConditionalAccess"},
{"field": "result", "op": "neq", "value": "success"},
],
},
{
"name": "After-Hours Admin Activity",
"enabled": True,
"severity": "medium",
"message": "A privileged operation was performed outside business hours (9 AM 5 PM).",
"conditions": [
{
"field": "service",
"op": "in",
"value": ["Directory", "UserManagement", "GroupManagement", "RoleManagement"],
},
{"field": "timestamp", "op": "after_hours"},
],
},
{
"name": "New Application Registration",
"enabled": True,
"severity": "medium",
"message": (
"A new application was registered in Entra ID. Review for shadow IT or unauthorized integrations."
),
"conditions": [
{"field": "service", "op": "eq", "value": "ApplicationManagement"},
{"field": "operation", "op": "contains", "value": "Add application"},
],
},
{
"name": "Admin Role Assignment",
"enabled": True,
"severity": "high",
"message": "A user was assigned an administrative role. Verify this was expected and authorized.",
"conditions": [
{"field": "service", "op": "eq", "value": "RoleManagement"},
{"field": "operation", "op": "contains", "value": "Add member to role"},
],
},
{
"name": "License Change",
"enabled": True,
"severity": "low",
"message": "A license was assigned or removed from a user. Monitor for unexpected cost changes.",
"conditions": [
{"field": "service", "op": "eq", "value": "License"},
],
},
{
"name": "Bulk User Deletion",
"enabled": True,
"severity": "high",
"message": (
"Multiple users were deleted in a short window. "
"This may indicate a compromised admin account or cleanup activity."
),
"conditions": [
{"field": "service", "op": "in", "value": ["Directory", "UserManagement"]},
{"field": "operation", "op": "contains", "value": "Delete user"},
],
},
{
"name": "Device Compliance Failure",
"enabled": True,
"severity": "medium",
"message": (
"A device failed compliance evaluation. "
"It may no longer meet your organization's security requirements."
),
"conditions": [
{"field": "service", "op": "eq", "value": "Intune"},
{"field": "operation", "op": "contains", "value": "compliance"},
{"field": "result", "op": "neq", "value": "success"},
],
},
{
"name": "Exchange Transport Rule Change",
"enabled": True,
"severity": "high",
"message": "An Exchange transport rule was modified. This could affect mail flow or security filtering.",
"conditions": [
{"field": "service", "op": "eq", "value": "Exchange"},
{"field": "operation", "op": "contains", "value": "Transport rule"},
],
},
{
"name": "Service Principal Credential Added",
"enabled": True,
"severity": "high",
"message": "A new secret or certificate was added to a service principal. Verify this was expected.",
"conditions": [
{"field": "service", "op": "eq", "value": "ApplicationManagement"},
{"field": "operation", "op": "contains", "value": "Add service principal credentials"},
],
},
{
"name": "External Sharing Enabled",
"enabled": True,
"severity": "medium",
"message": (
"External sharing settings were modified on a SharePoint site or team. Review for data exposure risk."
),
"conditions": [
{"field": "service", "op": "in", "value": ["SharePoint", "Teams"]},
{"field": "operation", "op": "contains", "value": "Sharing"},
],
},
]
inserted = 0
for rule in defaults:
try:
result = rules_collection.replace_one(
{"name": rule["name"]},
rule,
upsert=True,
)
if result.upserted_id:
inserted += 1
except Exception as exc:
logger.warning("Failed to seed rule", rule=rule["name"], error=str(exc))
if inserted:
logger.info("Default admin-ops rules seeded", inserted=inserted, total=len(defaults))

View File

@@ -22,15 +22,23 @@ def mock_watermarks_collection():
@pytest.fixture(scope="function")
def client(mock_events_collection, mock_watermarks_collection, monkeypatch):
monkeypatch.setattr("database.events_collection", mock_events_collection)
monkeypatch.setattr("database.saved_searches_collection", mock_events_collection)
monkeypatch.setattr("routes.fetch.events_collection", mock_events_collection)
monkeypatch.setattr("routes.events.events_collection", mock_events_collection)
monkeypatch.setattr("routes.ask.events_collection", mock_events_collection)
monkeypatch.setattr("routes.saved_searches.saved_searches_collection", mock_events_collection)
monkeypatch.setattr("watermark.watermarks_collection", mock_watermarks_collection)
monkeypatch.setattr("routes.health.watermarks_collection", mock_watermarks_collection)
monkeypatch.setattr("routes.fetch.get_watermark", lambda source: None)
monkeypatch.setattr("routes.fetch.set_watermark", lambda source, ts: None)
monkeypatch.setattr("auth.AUTH_ENABLED", False)
monkeypatch.setattr("routes.mcp.AUTH_ENABLED", False)
monkeypatch.setattr("config.PRIVACY_SERVICES", set())
monkeypatch.setattr("config.PRIVACY_SENSITIVE_OPERATIONS", set())
monkeypatch.setattr("routes.events.PRIVACY_SERVICES", set())
monkeypatch.setattr("routes.events.PRIVACY_SENSITIVE_OPERATIONS", set())
monkeypatch.setattr("routes.ask.PRIVACY_SERVICES", set())
monkeypatch.setattr("routes.ask.PRIVACY_SENSITIVE_OPERATIONS", set())
monkeypatch.setattr("database.db.command", lambda cmd: {"ok": 1} if cmd == "ping" else {})
# Mock audit trail and rules collections so tests don't wait on real MongoDB
@@ -41,6 +49,21 @@ def client(mock_events_collection, mock_watermarks_collection, monkeypatch):
monkeypatch.setattr("rules.rules_collection", audit_db["alert_rules"])
monkeypatch.setattr("routes.rules.rules_collection", audit_db["alert_rules"])
# Mock Redis so tests don't require a running Redis server
class FakeRedis:
async def get(self, key):
return None
async def setex(self, key, ttl, value):
pass
async def fake_get_arq_pool():
return FakeRedis()
monkeypatch.setattr("redis_client.get_arq_pool", fake_get_arq_pool)
monkeypatch.setattr("routes.ask.get_arq_pool", fake_get_arq_pool)
monkeypatch.setattr("routes.jobs.get_redis", fake_get_arq_pool)
from main import app
return TestClient(app)

View File

@@ -89,6 +89,18 @@ def test_explain_event_with_llm_mock(client, mock_events_collection, monkeypatch
monkeypatch.setattr("routes.ask._explain_event", fake_explain)
class FakeRedis:
async def get(self, key):
return None
async def setex(self, key, ttl, value):
pass
async def fake_get_arq_pool():
return FakeRedis()
monkeypatch.setattr("routes.ask.get_arq_pool", fake_get_arq_pool)
mock_events_collection.insert_one(
{
"id": "evt-explain2",
@@ -107,6 +119,146 @@ def test_explain_event_with_llm_mock(client, mock_events_collection, monkeypatch
assert data["llm_used"] is True
def test_saved_searches_crud(client, monkeypatch):
monkeypatch.setattr("auth.AUTH_ENABLED", False)
# Create
response = client.post(
"/api/saved-searches", json={"name": "Test search", "filters": {"actor": "alice", "result": "success"}}
)
assert response.status_code == 200
created = response.json()
assert created["name"] == "Test search"
assert created["filters"]["actor"] == "alice"
search_id = created["id"]
# List
response2 = client.get("/api/saved-searches")
assert response2.status_code == 200
items = response2.json()
assert len(items) == 1
assert items[0]["name"] == "Test search"
# Delete
response3 = client.delete(f"/api/saved-searches/{search_id}")
assert response3.status_code == 200
# List empty
response4 = client.get("/api/saved-searches")
assert response4.status_code == 200
assert len(response4.json()) == 0
def test_saved_searches_delete_not_found(client, monkeypatch):
monkeypatch.setattr("auth.AUTH_ENABLED", False)
response = client.delete("/api/saved-searches/nonexistent")
assert response.status_code == 404
def test_saved_searches_create_validation(client, monkeypatch):
monkeypatch.setattr("auth.AUTH_ENABLED", False)
response = client.post("/api/saved-searches", json={"name": " ", "filters": {}})
assert response.status_code == 400
def test_privacy_filtering_events_by_operation(client, mock_events_collection, monkeypatch):
monkeypatch.setattr("config.PRIVACY_SENSITIVE_OPERATIONS", {"MailItemsAccessed", "Send"})
monkeypatch.setattr("routes.events.PRIVACY_SENSITIVE_OPERATIONS", {"MailItemsAccessed", "Send"})
monkeypatch.setattr("auth.PRIVACY_SERVICE_ROLES", {"SecurityAdmin"})
monkeypatch.setattr("auth.user_can_access_privacy_services", lambda claims: False)
monkeypatch.setattr("routes.events.user_can_access_privacy_services", lambda claims: False)
mock_events_collection.insert_one(
{
"id": "evt-safe",
"timestamp": datetime.now(UTC).isoformat(),
"service": "Exchange",
"operation": "Add-MailboxPermission",
"result": "success",
"actor_display": "Alice",
"raw_text": "",
}
)
mock_events_collection.insert_one(
{
"id": "evt-priv",
"timestamp": datetime.now(UTC).isoformat(),
"service": "Exchange",
"operation": "Send",
"result": "success",
"actor_display": "Bob",
"raw_text": "",
}
)
response = client.get("/api/events")
assert response.status_code == 200
data = response.json()
ids = [e["id"] for e in data["items"]]
assert "evt-safe" in ids
assert "evt-priv" not in ids
def test_privacy_filter_options_shows_service_hides_ops(client, mock_events_collection, monkeypatch):
monkeypatch.setattr("config.PRIVACY_SENSITIVE_OPERATIONS", {"MailItemsAccessed"})
monkeypatch.setattr("routes.events.PRIVACY_SENSITIVE_OPERATIONS", {"MailItemsAccessed"})
monkeypatch.setattr("auth.PRIVACY_SERVICE_ROLES", {"SecurityAdmin"})
monkeypatch.setattr("auth.user_can_access_privacy_services", lambda claims: False)
monkeypatch.setattr("routes.events.user_can_access_privacy_services", lambda claims: False)
mock_events_collection.insert_one(
{
"id": "evt-1",
"timestamp": datetime.now(UTC).isoformat(),
"service": "Exchange",
"operation": "MailItemsAccessed",
"result": "success",
"actor_display": "Alice",
"raw_text": "",
}
)
mock_events_collection.insert_one(
{
"id": "evt-2",
"timestamp": datetime.now(UTC).isoformat(),
"service": "Exchange",
"operation": "Add-MailboxPermission",
"result": "success",
"actor_display": "Bob",
"raw_text": "",
}
)
response = client.get("/api/filter-options")
assert response.status_code == 200
data = response.json()
assert "Exchange" in data["services"]
assert "MailItemsAccessed" not in data["operations"]
assert "Add-MailboxPermission" in data["operations"]
def test_privacy_explain_forbidden_by_operation(client, mock_events_collection, monkeypatch):
monkeypatch.setattr("config.PRIVACY_SENSITIVE_OPERATIONS", {"Send"})
monkeypatch.setattr("routes.ask.PRIVACY_SENSITIVE_OPERATIONS", {"Send"})
monkeypatch.setattr("auth.PRIVACY_SERVICE_ROLES", {"SecurityAdmin"})
monkeypatch.setattr("auth.user_can_access_privacy_services", lambda claims: False)
monkeypatch.setattr("routes.ask.user_can_access_privacy_services", lambda claims: False)
mock_events_collection.insert_one(
{
"id": "evt-send",
"timestamp": datetime.now(UTC).isoformat(),
"service": "Exchange",
"operation": "Send",
"result": "success",
"actor_display": "Bob",
"raw_text": "",
}
)
response = client.post("/api/events/evt-send/explain")
assert response.status_code == 403
def test_health(client):
response = client.get("/health")
assert response.status_code == 200

View File

@@ -1,5 +1,7 @@
import asyncio
from datetime import UTC, datetime, timedelta
from jobs import set_cached_ask
from routes.ask import _build_event_query, _extract_entity, _extract_time_range
# ---------------------------------------------------------------------------
@@ -350,3 +352,131 @@ class TestAskEndpoint:
data = response.json()
assert data["query_info"]["event_count"] == 1
assert data["events"][0]["id"] == "evt-bob"
class TestAskCaching:
def test_ask_cache_hit_returns_cached_answer(self, client, mock_events_collection, monkeypatch):
"""If the answer is cached, the LLM should not be called."""
now = datetime.now(UTC)
mock_events_collection.insert_one(
{
"id": "evt-cache",
"timestamp": now.isoformat(),
"service": "Directory",
"operation": "Add user",
"result": "success",
"actor_display": "Alice",
"target_displays": ["USER-001"],
"display_summary": "summary",
"raw_text": "raw",
}
)
llm_called = False
async def fake_llm(question, events, total=None, excluded_services=None):
nonlocal llm_called
llm_called = True
return "This should NOT appear."
monkeypatch.setattr("routes.ask.LLM_API_KEY", "fake-key")
monkeypatch.setattr("routes.ask._call_llm", fake_llm)
# Pre-populate cache with a specific answer
class CachingFakeRedis:
def __init__(self):
self.store = {}
async def get(self, key):
return self.store.get(key)
async def setex(self, key, ttl, value):
self.store[key] = value
redis = CachingFakeRedis()
# Seed cache with the exact filters the endpoint will generate
filters_snapshot = {
"services": None,
"actor": None,
"operation": None,
"result": None,
"start": None,
"end": None,
"include_tags": None,
"exclude_tags": None,
}
asyncio.run(
set_cached_ask(
redis,
"What happened to USER-001?",
filters_snapshot,
[{"id": "evt-cache"}],
{"answer": "Cached answer!", "llm_used": True, "llm_error": None},
)
)
async def fake_get_arq_pool():
return redis
monkeypatch.setattr("routes.ask.get_arq_pool", fake_get_arq_pool)
response = client.post("/api/ask", json={"question": "What happened to USER-001?"})
assert response.status_code == 200
data = response.json()
assert data["answer"] == "Cached answer!"
assert data["llm_used"] is True
assert llm_called is False
def test_ask_async_mode_returns_job_id(self, client, mock_events_collection, monkeypatch):
"""Async mode should return immediately with a job_id."""
now = datetime.now(UTC)
mock_events_collection.insert_one(
{
"id": "evt-async",
"timestamp": now.isoformat(),
"service": "Directory",
"operation": "Add user",
"result": "success",
"actor_display": "Alice",
"target_displays": ["USER-001"],
"display_summary": "summary",
"raw_text": "raw",
}
)
monkeypatch.setattr("routes.ask.LLM_API_KEY", "fake-key")
# Mock arq pool to capture enqueue_job call
class FakeArqPool:
def __init__(self):
self.enqueued = []
async def get(self, key):
return None
async def setex(self, key, ttl, value):
pass
async def enqueue_job(self, func, *args, **kwargs):
from unittest.mock import MagicMock
job = MagicMock()
job.job_id = "job-12345"
self.enqueued.append((func, args, kwargs))
return job
pool = FakeArqPool()
async def fake_get_arq_pool():
return pool
monkeypatch.setattr("routes.ask.get_arq_pool", fake_get_arq_pool)
response = client.post("/api/ask", json={"question": "What happened to USER-001?", "async_mode": True})
assert response.status_code == 200
data = response.json()
assert data["job_id"] == "job-12345"
assert data["llm_used"] is False
assert "being processed" in data["answer"]
assert len(pool.enqueued) == 1
assert pool.enqueued[0][0] == "process_ask_question"

View File

@@ -59,6 +59,7 @@ def test_evaluate_event_creates_alert(monkeypatch):
inserted["doc"] = doc
monkeypatch.setattr(alerts_collection, "insert_one", mock_insert)
monkeypatch.setattr(alerts_collection, "count_documents", lambda *args, **kwargs: 0)
event = {"id": "e1", "operation": "Add user", "timestamp": datetime.now(UTC).isoformat(), "dedupe_key": "dk1"}
triggered = evaluate_event(event)

View File

@@ -1,4 +1,19 @@
services:
redis:
image: valkey/valkey:8-alpine
container_name: aoc-redis
restart: always
volumes:
- redis_data:/data
networks:
- aoc-internal
healthcheck:
test: ["CMD", "redis-cli", "ping"]
interval: 10s
timeout: 3s
retries: 5
start_period: 5s
mongo:
image: mongo:7
container_name: aoc-mongo
@@ -27,9 +42,12 @@ services:
- .env
environment:
MONGO_URI: mongodb://${MONGO_ROOT_USERNAME}:${MONGO_ROOT_PASSWORD}@mongo:27017/
REDIS_URL: redis://redis:6379/0
depends_on:
mongo:
condition: service_healthy
redis:
condition: service_healthy
networks:
- aoc-internal
healthcheck:
@@ -39,6 +57,24 @@ services:
retries: 3
start_period: 10s
worker:
image: git.cqre.net/cqrenet/aoc-backend:${AOC_VERSION:-latest}
container_name: aoc-worker
restart: always
env_file:
- .env
environment:
MONGO_URI: mongodb://${MONGO_ROOT_USERNAME}:${MONGO_ROOT_PASSWORD}@mongo:27017/
REDIS_URL: redis://redis:6379/0
command: ["arq", "jobs.WorkerSettings"]
depends_on:
redis:
condition: service_healthy
mongo:
condition: service_healthy
networks:
- aoc-internal
nginx:
image: nginx:alpine
container_name: aoc-nginx
@@ -58,6 +94,7 @@ services:
volumes:
mongo_data:
redis_data:
networks:
aoc-internal:

View File

@@ -1,4 +1,13 @@
services:
redis:
image: valkey/valkey:8-alpine
container_name: aoc-redis
restart: always
ports:
- "6379:6379"
volumes:
- redis_data:/data
mongo:
image: mongo:7
container_name: aoc-mongo
@@ -21,10 +30,27 @@ services:
- .env
environment:
MONGO_URI: mongodb://${MONGO_ROOT_USERNAME}:${MONGO_ROOT_PASSWORD}@mongo:${MONGO_PORT}/
REDIS_URL: redis://redis:6379/0
depends_on:
- mongo
- redis
ports:
- "8000:8000"
worker:
build: ./backend
container_name: aoc-worker
restart: always
env_file:
- .env
environment:
MONGO_URI: mongodb://${MONGO_ROOT_USERNAME}:${MONGO_ROOT_PASSWORD}@mongo:${MONGO_PORT}/
REDIS_URL: redis://redis:6379/0
command: ["arq", "jobs.WorkerSettings"]
depends_on:
- redis
- mongo
volumes:
mongo_data:
redis_data: