19 Commits

Author SHA1 Message Date
e2cea50d87 hotfix(v1.7.9): auth diagnostics and rate-limit exemptions
All checks were successful
CI / lint-and-test (push) Successful in 2m30s
Release / build-and-push (push) Successful in 4m46s
- Exempt /api/config/auth, /api/config/features, /health, /metrics from rate limiting
- Fix generic exception handler to return proper JSON for HTTPException instead of re-raising
- Add startup log with auth_enabled and version
- Add frontend console logging for auth config fetch errors
- Show 'Auth: OFF' or 'Auth: misconfigured' on auth button instead of empty text
- Add backend debug logging to /api/config/auth endpoint
2026-04-27 10:09:44 +02:00
7fe53f882a hotfix(v1.7.8): restore CORS wildcard and fix CSP for MSAL auth
All checks were successful
CI / lint-and-test (push) Successful in 51s
Release / build-and-push (push) Successful in 2m4s
- Revert automatic CORS wildcard stripping that broke production deployments
  with CORS_ORIGINS=* (now logs a warning but preserves the config)
- Expand CSP headers to allow MSAL auth flows:
  - connect-src: login.microsoftonline.com
  - frame-src: login.microsoftonline.com
  - form-action: login.microsoftonline.com
2026-04-27 09:41:28 +02:00
d01e7801ed security: v1.7.7 hardening release
All checks were successful
CI / lint-and-test (push) Successful in 51s
Release / build-and-push (push) Successful in 1m57s
- Add WEBHOOK_CLIENT_SECRET validation for Graph webhooks
- Add Redis-backed rate limiting (fetch/ask/write/default tiers)
- Validate LLM_BASE_URL to prevent SSRF (HTTPS only, block private IPs)
- Enforce non-wildcard CORS when AUTH_ENABLED=true
- Add Content-Security-Policy headers
- Fix audit middleware to use verified JWT claims via contextvars
- Cap bulk_tags updates to 10,000 documents
- Return generic error messages to clients (no internal detail leakage)
- Strict AlertCondition Pydantic model for alert rules
- Security warning on MCP stdio server startup
- Remove MongoDB/Redis host ports from docker-compose
- Remove mongo_query from /ask API response
2026-04-27 09:16:57 +02:00
7cd7709b4a fix: dedupe alert_rules before creating unique index in setup_indexes()
All checks were successful
CI / lint-and-test (push) Successful in 1m7s
Release / build-and-push (push) Successful in 2m25s
The unique index on alert_rules.name was being created before duplicates
were cleaned up, causing DuplicateKeyError on startup when existing
duplicates were present. Move deduplication into setup_indexes() so it
runs before the unique index is created.

v1.7.6
2026-04-22 15:20:19 +02:00
9cd50d1257 chore: bump version to 1.7.5
All checks were successful
CI / lint-and-test (push) Successful in 30s
Release / build-and-push (push) Successful in 1m29s
2026-04-22 15:13:55 +02:00
646d61f72e fix: dedupe existing rules + unique index to prevent duplicates
- Add unique index on alert_rules.name in setup_indexes()
- seed_default_rules() now removes duplicates by name before upserting
- Keeps the oldest document (_id ascending) when deduping
2026-04-22 15:13:41 +02:00
5f7a98f21c chore: bump version to 1.7.4
All checks were successful
CI / lint-and-test (push) Successful in 28s
Release / build-and-push (push) Successful in 1m30s
2026-04-22 14:57:06 +02:00
19ed231a31 fix: prevent duplicate default rules on multi-worker startup
- Replace insert_many with replace_one(..., upsert=True) keyed by rule name
- Safe for concurrent startup with multiple gunicorn workers
2026-04-22 14:56:53 +02:00
f812fda150 chore: bump version to 1.7.3
All checks were successful
CI / lint-and-test (push) Successful in 44s
Release / build-and-push (push) Successful in 1m40s
2026-04-22 14:48:17 +02:00
a194c78c59 feat: all panels are now collapsible
- Source Health, Alerts, Alert Rules, Filters, Ask, Events panels all collapsible
- Click panel header to expand/collapse
- Chevron indicator rotates to show state
- Collapsed state persisted to localStorage (aoc_panels key)
2026-04-22 14:48:03 +02:00
e984899d4c chore: bump version to 1.7.2
All checks were successful
Release / build-and-push (push) Successful in 1m39s
CI / lint-and-test (push) Successful in 43s
2026-04-22 14:43:13 +02:00
b618cb29ea feat: alert rules management UI
- Add Alert Rules panel between Alerts and Filters sections
- List all rules with severity badge, on/off toggle, conditions preview
- Add Rule button opens modal with form for name, severity, message, conditions
- Edit existing rules inline
- Delete rules with confirmation
- Condition builder supports eq, neq, contains, in, after_hours operators
2026-04-22 14:42:58 +02:00
3e1416cd52 chore: bump version to 1.7.1
All checks were successful
CI / lint-and-test (push) Successful in 31s
Release / build-and-push (push) Successful in 1m32s
2026-04-22 14:21:46 +02:00
94983c43e9 fix: alert panel always visible, version display normalization
- Remove x-show condition hiding alert panel when no alerts exist
- Add empty state message explaining alerts appear on rule triggers
- Normalize appVersion in loadVersion() to strip leading 'v' (prevents vv1.7.0 in footer)
2026-04-22 14:21:34 +02:00
0a16cf6870 chore: bump version to 1.7.0
All checks were successful
CI / lint-and-test (push) Successful in 26s
Release / build-and-push (push) Successful in 1m15s
2026-04-22 14:12:49 +02:00
e348881083 feat: Admin Operations SIEM — alerts, notifications, pre-built rules
- Add pluggable notification system (webhook, Slack, Teams) with retry
- Add alert deduplication: same rule + actor within 15 min = one alert
- Add 10 pre-built admin-ops rule templates seeded on startup:
  - Failed Conditional Access, After-Hours Admin Activity
  - New Application Registration, Admin Role Assignment
  - License Change, Bulk User Deletion
  - Device Compliance Failure, Exchange Transport Rule Change
  - Service Principal Credential Added, External Sharing Enabled
- Add /api/alerts, /api/alerts/{id}/status, /api/alerts/summary endpoints
- Add alert dashboard to frontend with status filters and ack/resolve buttons
- Add alert summary badge in hero header (high/medium/low counts)
- New env vars: ALERT_WEBHOOK_URL, ALERT_WEBHOOK_FORMAT, ALERT_DEDUPE_MINUTES
2026-04-22 14:12:36 +02:00
a220494bcf docs: add Phase 6 multi-tenancy plan to roadmap
All checks were successful
CI / lint-and-test (push) Successful in 43s
- Row-level isolation architecture
- Per-tenant Entra + Graph credentials
- License-gated premium feature
- Deferred until SIEM export and alerting are production-tested
2026-04-22 13:49:56 +02:00
5bda1dd616 chore: bump version to 1.6.4
All checks were successful
CI / lint-and-test (push) Successful in 25s
Release / build-and-push (push) Successful in 1m29s
2026-04-22 12:16:32 +02:00
3e333291c6 fix: revert to single-click service filter, show all services by default, page size 24
- Revert +/- buttons on service pills back to single-click = filter only this service
- Remove default exclusion of Exchange/SharePoint/Teams (privacy controls handle this server-side)
- Change default page size from 25 to 24 (divisible by 3 for the 3-column grid)
- Update DEFAULT_PAGE_SIZE config default to 24
2026-04-22 12:16:20 +02:00
23 changed files with 1511 additions and 99 deletions


@@ -56,7 +56,17 @@ LLM_API_VERSION=
REDIS_URL=redis://localhost:6379/0
# UI default page size (number of events shown per page)
-DEFAULT_PAGE_SIZE=25
+DEFAULT_PAGE_SIZE=24
# Alert notifications (optional)
# Send triggered admin-ops alerts to a webhook (Slack, Teams, or generic)
ALERT_WEBHOOK_URL=
ALERT_WEBHOOK_FORMAT=generic # generic | slack | teams
ALERT_DEDUPE_MINUTES=15
# Webhook security (optional but strongly recommended)
# Set this to the same clientState used when creating Graph subscriptions
WEBHOOK_CLIENT_SECRET=
# Optional: privacy / access control
# Hide entire services from users without PRIVACY_SERVICE_ROLES

RELEASE_NOTES_v1.7.7.md (new file, 99 lines)

@@ -0,0 +1,99 @@
# AOC v1.7.7 Release Notes
**Release date:** 2026-04-24
---
## Security Hardening
This release is a focused security patch addressing findings from an internal audit. All users running AOC in production are encouraged to upgrade.
### Webhook authentication (`/api/webhooks/graph`)
- **ClientState validation** — Notifications now require a matching `WEBHOOK_CLIENT_SECRET`. Set this in your `.env` to the same value used when creating Graph subscriptions.
- Rejects spoofed notification payloads with `401 Unauthorized`.
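
The check described above boils down to a constant-time comparison of the notification's `clientState` against the configured secret. A minimal sketch (function and variable names here are illustrative, not the exact AOC implementation):

```python
import hmac

# In practice this comes from the environment (WEBHOOK_CLIENT_SECRET).
WEBHOOK_CLIENT_SECRET = "your-random-secret"

def validate_client_state(notification: dict) -> bool:
    """Compare the Graph notification's clientState against the configured
    secret using a constant-time comparison to avoid timing side channels."""
    supplied = notification.get("clientState", "")
    return hmac.compare_digest(supplied, WEBHOOK_CLIENT_SECRET)
```

A handler would reject with `401 Unauthorized` whenever this returns `False`.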
### Rate limiting
- **Redis-backed fixed-window rate limiting** is now enabled by default.
- Per-category limits:
- `/api/fetch-audit-logs` — 10 requests/hour
- `/api/ask` — 30 requests/minute
- `/api/events/bulk-tags` — 20 requests/minute
- All other endpoints — 120 requests/minute
- Returns `429 Too Many Requests` with a `Retry-After` header when exceeded.
### SSRF protection for LLM calls
- `LLM_BASE_URL` is now validated before every outbound request.
- Blocks non-HTTPS URLs, localhost, link-local addresses (`169.254.169.254`), and all private IP ranges.
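
The validation amounts to checking the scheme and every resolved address of the host. A dependency-free sketch of that idea (illustrative, not the exact AOC code):

```python
import ipaddress
import socket
from urllib.parse import urlparse

def is_safe_llm_url(url: str) -> bool:
    """HTTPS only; the host must not resolve to a private, loopback, or
    link-local address (which also covers 169.254.169.254)."""
    parsed = urlparse(url)
    if parsed.scheme != "https" or not parsed.hostname:
        return False
    try:
        infos = socket.getaddrinfo(parsed.hostname, None)
    except socket.gaierror:
        return False
    for info in infos:
        ip = ipaddress.ip_address(info[4][0])
        if ip.is_private or ip.is_loopback or ip.is_link_local:
            return False
    return True
```

Resolving before each request (rather than once at startup) also defends against DNS rebinding between validation and use.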
### CORS enforcement
- Wildcard (`*`) origins are **automatically stripped** when `AUTH_ENABLED=true`.
- A startup warning is logged if an insecure CORS configuration is detected.
### Content Security Policy
- API and HTML responses now include a `Content-Security-Policy` header.
- Restricts script sources to self, CDN origins, and MSAL auth library.
### Audit trail integrity
- The audit middleware no longer parses JWT tokens without signature verification.
- Verified claims are now propagated safely via `contextvars`, eliminating audit log poisoning.
### Standalone MCP server
- Prints a prominent security warning on startup reminding operators that the stdio transport has no authentication layer.
---
## Operational Improvements
### Bulk tag cap
- `POST /api/events/bulk-tags` now refuses to update more than **10,000 events** in a single request.
- Returns `400` with guidance to narrow filters.
### Generic error responses
- Internal exception details are no longer leaked in HTTP 500/502 responses.
- Full stack traces remain in server-side logs.
### Alert rule schema
- `conditions` field now uses a strict Pydantic model (`AlertCondition`) instead of an unconstrained `list[dict]`.
- Prevents stored data pollution from malformed rule payloads.
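
The constraint set matches the operators exposed by the rule editor (`eq`, `neq`, `contains`, `in`, `after_hours`). The release uses a strict Pydantic model; this dependency-free sketch shows the equivalent validation (field names are assumptions):

```python
from dataclasses import dataclass

# Operators accepted by the condition builder UI.
ALLOWED_OPS = {"eq", "neq", "contains", "in", "after_hours"}

@dataclass
class AlertCondition:
    field: str
    op: str
    value: str = ""

    def __post_init__(self):
        # Reject anything outside the known shape instead of storing it.
        if self.op not in ALLOWED_OPS:
            raise ValueError(f"unsupported operator: {self.op}")
        if not self.field:
            raise ValueError("field is required")
```

Malformed payloads now fail at the API boundary rather than landing in `alert_rules`.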
### Docker Compose
- MongoDB (`27017`) and Redis (`6379`) ports are no longer forwarded to the Docker host.
- Internal services are reachable only via the Docker network.
---
## Configuration
Add to your `.env`:
```bash
# Required if you use Graph webhooks
WEBHOOK_CLIENT_SECRET=your-random-secret
# Optional: disable rate limiting (not recommended)
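# The three knobs above shape the default tier; per-category tiers
# (fetch/ask/write) are configured in code.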
RATE_LIMIT_ENABLED=true
RATE_LIMIT_REQUESTS=120
RATE_LIMIT_WINDOW_SECONDS=60
```
---
## Upgrade notes
**No breaking changes.** Existing event data, tags, comments, and saved searches are preserved.
After pulling:
```bash
export AOC_VERSION=v1.7.7
docker compose -f docker-compose.prod.yml pull
docker compose -f docker-compose.prod.yml up -d
```
---
## Docker image
```
git.cqre.net/cqrenet/aoc-backend:v1.7.7
```


@@ -72,3 +72,32 @@ Goal: add AI-powered analysis and external tool integration.
## Completed in this PR
All Phase 5 items marked done were implemented in v1.3.0–v1.5.0.
Redis caching + async queue implemented in v1.6.0, switched to Valkey.
UI polish (topbar, footer, clickable pills) in v1.6.1–v1.6.4.
---
## Phase 6: Multi-Tenancy (Premium) ⏸️
Goal: allow MSPs to manage multiple client tenants from a single deployment.
Status: **Planned — not started**. Architecture designed, pending validation of core features (SIEM export, alerting) in production first.
### Architecture
- Row-level isolation: `tenant_id` field on every MongoDB document
- Each tenant has their own Microsoft Entra tenant + app registration credentials
- Auth: user's JWT `tid` claim maps to tenant config automatically
- Super-admin role for MSP staff to access all tenants
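
The `tid`-claim mapping above can be sketched as a registry lookup keyed by Entra tenant id. A hypothetical sketch (registry shape and names are illustrative only):

```python
# Per-tenant config, keyed by Entra tenant id ("tid" claim).
TENANT_REGISTRY = {
    "11111111-1111-1111-1111-111111111111": {
        "name": "Contoso",
        "graph_client_id": "app-registration-1",
    },
}

def resolve_tenant(claims: dict) -> dict:
    """Map a verified JWT's tid claim to that tenant's config; reject unknowns."""
    tid = claims.get("tid")
    tenant = TENANT_REGISTRY.get(tid)
    if tenant is None:
        raise PermissionError(f"unknown tenant: {tid}")
    return tenant
```

Because the claim comes from a verified token, the lookup itself needs no extra user input.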
### Implementation phases
- **Phase 6.1** (2–3 days): Tenant model & registry, tenant-aware data layer, per-tenant Graph API auth
- **Phase 6.2** (1 day): Tenant-scoped API routes, tenant-specific config endpoints
- **Phase 6.3** (2 days): Frontend tenant switcher, tenant name display, admin page
- **Phase 6.4** (1 day): License gating — signed JWT `LICENSE_KEY` gates multi-tenant mode
### Licensing model
- Single-tenant: remains MIT/free
- Multi-tenant: premium feature requiring a signed license key
- License key is a JWT with claims: `plan`, `max_tenants`, `exp`, `features`
- Offline license generation tool included
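
The claim check can be sketched like this, assuming signature verification (e.g. via PyJWT) has already produced the claims dict; the `plan` value and claim names beyond those listed above are assumptions:

```python
import time

def license_allows(claims: dict, tenant_count: int) -> bool:
    """Gate multi-tenant mode on the verified license claims."""
    if claims.get("exp", 0) < time.time():
        return False  # expired license
    if claims.get("plan") != "multi-tenant":
        return False  # assumed plan identifier for illustration
    return tenant_count <= claims.get("max_tenants", 1)
```

Signature verification against the vendor's public key must happen before this check; the claims are meaningless otherwise.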
### Effort estimate
~7–9 days total. Deferred until SIEM export and alerting are battle-tested.


@@ -1 +1 @@
-1.6.3
+1.7.9


@@ -1,3 +1,4 @@
import contextvars
import time
import requests
@@ -15,6 +16,9 @@ from fastapi import Header, HTTPException
from jwt import ExpiredSignatureError, InvalidTokenError, decode
from jwt.algorithms import RSAAlgorithm

# Thread-/task-local storage for verified auth claims (used by audit middleware)
_auth_context: contextvars.ContextVar[dict | None] = contextvars.ContextVar("auth_context", default=None)

JWKS_CACHE = {"exp": 0, "keys": []}
logger = structlog.get_logger("aoc.auth")
@@ -94,7 +98,9 @@ def user_can_access_privacy_services(claims: dict) -> bool:
def require_auth(authorization: str | None = Header(None)):
    if not AUTH_ENABLED:
-        return {"sub": "anonymous"}
+        user = {"sub": "anonymous"}
+        _auth_context.set(user)
+        return user
    if not authorization or not authorization.lower().startswith("bearer "):
        raise HTTPException(status_code=401, detail="Missing bearer token")
@@ -106,4 +112,5 @@ def require_auth(authorization: str | None = Header(None)):
    if not _allowed(claims, AUTH_ALLOWED_ROLES, AUTH_ALLOWED_GROUPS):
        raise HTTPException(status_code=403, detail="Forbidden")
+    _auth_context.set(claims)
    return claims


@@ -61,7 +61,20 @@ class Settings(BaseSettings):
    REDIS_URL: str = "redis://localhost:6379/0"

    # UI defaults
-    DEFAULT_PAGE_SIZE: int = 25
+    DEFAULT_PAGE_SIZE: int = 24

    # Alert notifications
    ALERT_WEBHOOK_URL: str = ""
    ALERT_WEBHOOK_FORMAT: str = "generic"  # generic | slack | teams
    ALERT_DEDUPE_MINUTES: int = 15

    # Webhook security
    WEBHOOK_CLIENT_SECRET: str = ""

    # Rate limiting
    RATE_LIMIT_ENABLED: bool = True
    RATE_LIMIT_REQUESTS: int = 120
    RATE_LIMIT_WINDOW_SECONDS: int = 60

_settings = Settings()
@@ -104,3 +117,13 @@ PRIVACY_SERVICE_ROLES = {r.strip() for r in _settings.PRIVACY_SERVICE_ROLES.spli
REDIS_URL = _settings.REDIS_URL
DEFAULT_PAGE_SIZE = _settings.DEFAULT_PAGE_SIZE

ALERT_WEBHOOK_URL = _settings.ALERT_WEBHOOK_URL
ALERT_WEBHOOK_FORMAT = _settings.ALERT_WEBHOOK_FORMAT
ALERT_DEDUPE_MINUTES = _settings.ALERT_DEDUPE_MINUTES
WEBHOOK_CLIENT_SECRET = _settings.WEBHOOK_CLIENT_SECRET
RATE_LIMIT_ENABLED = _settings.RATE_LIMIT_ENABLED
RATE_LIMIT_REQUESTS = _settings.RATE_LIMIT_REQUESTS
RATE_LIMIT_WINDOW_SECONDS = _settings.RATE_LIMIT_WINDOW_SECONDS


@@ -8,9 +8,24 @@ client = MongoClient(MONGO_URI or "mongodb://localhost:27017")
db = client[DB_NAME]
events_collection = db["events"]
saved_searches_collection = db["saved_searches"]
alerts_collection = db["alerts"]

logger = structlog.get_logger("aoc.database")


def _dedupe_alert_rules():
    """Remove duplicate alert_rules by name, keeping the oldest document."""
    try:
        pipeline = [
            {"$sort": {"_id": ASCENDING}},
            {"$group": {"_id": "$name", "first_id": {"$first": "$_id"}}},
        ]
        seen = {doc["_id"]: doc["first_id"] for doc in db["alert_rules"].aggregate(pipeline)}
        for name, keep_id in seen.items():
            db["alert_rules"].delete_many({"name": name, "_id": {"$ne": keep_id}})
    except Exception:
        pass  # Collection may not exist yet


def setup_indexes(max_retries: int = 5, delay: float = 2.0):
    """Ensure MongoDB indexes exist. Retries on connection errors."""
    from time import sleep
@@ -22,6 +37,8 @@ def setup_indexes(max_retries: int = 5, delay: float = 2.0):
    events_collection.create_index([("service", ASCENDING), ("timestamp", DESCENDING)])
    events_collection.create_index("id")
    saved_searches_collection.create_index([("created_by", ASCENDING), ("created_at", DESCENDING)])
    _dedupe_alert_rules()
    db["alert_rules"].create_index("name", unique=True)
    events_collection.create_index(
        [("actor_display", TEXT), ("raw_text", TEXT), ("operation", TEXT)],
        name="text_search_index",


@@ -4,7 +4,7 @@
  <meta charset="UTF-8" />
  <meta name="viewport" content="width=device-width, initial-scale=1.0" />
  <title>Admin Operations Center</title>
-  <link rel="stylesheet" href="/style.css?v=11" />
+  <link rel="stylesheet" href="/style.css?v=15" />
  <script defer src="https://cdn.jsdelivr.net/npm/alpinejs@3.x.x/dist/cdn.min.js"></script>
  <script src="https://alcdn.msauth.net/browser/2.37.0/js/msal-browser.min.js" crossorigin="anonymous"></script>
</head>
@@ -47,11 +47,20 @@
  <h1>Audit Log Explorer</h1>
  <p class="lede">Search and review Microsoft audit events from Entra, Intune, Exchange, SharePoint, and Teams.</p>
</div>
<div class="alert-summary" x-show="alertSummary.total_open > 0">
<div class="alert-badge alert-badge--high" x-show="alertSummary.high > 0" x-text="alertSummary.high"></div>
<div class="alert-badge alert-badge--medium" x-show="alertSummary.medium > 0" x-text="alertSummary.medium"></div>
<div class="alert-badge alert-badge--low" x-show="alertSummary.low > 0" x-text="alertSummary.low"></div>
<span class="alert-label">open alerts</span>
</div>
</header>
<section class="panel">
<div class="panel-header panel-header--collapsible" @click="togglePanel('sourceHealth')">
  <h3>Source Health</h3>
-  <div class="source-health">
+  <span class="panel-toggle" :class="panelState.sourceHealth ? 'panel-toggle--open' : ''"></span>
</div>
<div x-show="panelState.sourceHealth">
  <template x-for="src in sourceHealth" :key="src.source">
    <div class="health-card">
      <strong x-text="src.source"></strong>
@@ -65,7 +74,160 @@
</section>
<section class="panel">
-  <form id="filters" class="filters" @submit.prevent="resetPagination(); loadEvents()">
+  <div class="panel-header panel-header--collapsible" @click="togglePanel('alerts')">
<h3>Alerts</h3>
<div style="display:flex;align-items:center;gap:10px;">
<span x-text="`${alertSummary.total_open} open`" class="alert-open-count"></span>
<span class="panel-toggle" :class="panelState.alerts ? 'panel-toggle--open' : ''"></span>
</div>
</div>
<div x-show="panelState.alerts">
<div class="alert-filters">
<select x-model="alertsFilter.status" @change="alertsPage = 1; loadAlerts()">
<option value="">All statuses</option>
<option value="open">Open</option>
<option value="acknowledged">Acknowledged</option>
<option value="resolved">Resolved</option>
<option value="false_positive">False Positive</option>
</select>
<select x-model="alertsFilter.severity" @change="alertsPage = 1; loadAlerts()">
<option value="">All severities</option>
<option value="high">High</option>
<option value="medium">Medium</option>
<option value="low">Low</option>
</select>
</div>
<div class="alerts-list" x-show="alerts.length > 0">
<template x-for="alert in alerts" :key="alert._id || alert.event_id">
<div class="alert-card" :class="'alert-card--' + alert.severity">
<div class="alert-card__meta">
<span class="pill" :class="alert.severity === 'high' ? 'pill--err' : (alert.severity === 'medium' ? 'pill--warn' : '')" x-text="alert.severity"></span>
<span class="pill" x-text="alert.status"></span>
<small x-text="new Date(alert.timestamp).toLocaleString()"></small>
</div>
<strong x-text="alert.rule_name"></strong>
<p x-text="alert.message"></p>
<div class="alert-card__actions">
<button type="button" class="ghost btn--compact" @click="updateAlertStatus(alert._id, 'acknowledged')" x-show="alert.status === 'open'">Acknowledge</button>
<button type="button" class="ghost btn--compact" @click="updateAlertStatus(alert._id, 'resolved')" x-show="alert.status !== 'resolved' && alert.status !== 'false_positive'">Resolve</button>
<button type="button" class="ghost btn--compact" @click="updateAlertStatus(alert._id, 'false_positive')" x-show="alert.status !== 'false_positive'">False Positive</button>
<button type="button" class="ghost btn--compact" @click="updateAlertStatus(alert._id, 'open')" x-show="alert.status !== 'open'">Reopen</button>
</div>
</div>
</template>
</div>
<div class="alerts-empty" x-show="alerts.length === 0">
<p>No alerts match the current filters. Alerts appear here when rules trigger during event ingestion.</p>
</div>
<div class="pagination" x-show="alertsTotal > 20">
<button type="button" :disabled="alertsPage === 1" @click="alertsPage--; loadAlerts()">Prev</button>
<span x-text="`Page ${alertsPage}`"></span>
<button type="button" :disabled="alertsPage * 20 >= alertsTotal" @click="alertsPage++; loadAlerts()">Next</button>
</div>
</div>
</section>
<section class="panel">
<div class="panel-header panel-header--collapsible" @click="togglePanel('rules')">
<h3>Alert Rules</h3>
<div style="display:flex;align-items:center;gap:10px;">
<button type="button" class="btn--compact" @click.stop="openRuleEditor()">+ Add rule</button>
<span class="panel-toggle" :class="panelState.rules ? 'panel-toggle--open' : ''"></span>
</div>
</div>
<div x-show="panelState.rules">
<div class="rules-list">
<template x-for="rule in rules" :key="rule.id">
<div class="rule-card" :class="rule.enabled ? '' : 'rule-card--disabled'">
<div class="rule-card__meta">
<span class="pill" :class="rule.severity === 'high' ? 'pill--err' : (rule.severity === 'medium' ? 'pill--warn' : '')" x-text="rule.severity"></span>
<label class="toggle-label">
<input type="checkbox" :checked="rule.enabled" @change="toggleRule(rule.id, $event.target.checked)">
<span x-text="rule.enabled ? 'On' : 'Off'"></span>
</label>
</div>
<strong x-text="rule.name"></strong>
<p x-text="rule.message"></p>
<div class="rule-card__conditions">
<template x-for="(cond, idx) in rule.conditions" :key="idx">
<span class="pill pill--tag" x-text="`${cond.field} ${cond.op} ${cond.value}`"></span>
</template>
</div>
<div class="rule-card__actions">
<button type="button" class="ghost btn--compact" @click="openRuleEditor(rule)">Edit</button>
<button type="button" class="ghost btn--compact" @click="deleteRule(rule.id)">Delete</button>
</div>
</div>
</template>
</div>
<div class="rules-empty" x-show="rules.length === 0">
<p>No custom rules yet. Pre-built admin-ops rules are active by default. Add your own rules to detect specific patterns.</p>
</div>
</div>
<div id="ruleModal" class="modal hidden" role="dialog" aria-modal="true" :class="{ 'hidden': !ruleModalOpen }">
<div class="modal__content" style="max-width: 600px;">
<div class="modal__header">
<h3 x-text="ruleEditId ? 'Edit Rule' : 'New Rule'"></h3>
<button type="button" class="ghost" @click="ruleModalOpen = false">Close</button>
</div>
<form class="rule-form" @submit.prevent="saveRule()">
<label>
Name
<input type="text" x-model="ruleEdit.name" placeholder="e.g. Failed CA Policy" required />
</label>
<label>
Severity
<select x-model="ruleEdit.severity">
<option value="low">Low</option>
<option value="medium">Medium</option>
<option value="high">High</option>
</select>
</label>
<label>
Message
<textarea x-model="ruleEdit.message" placeholder="What should the alert say?" rows="2"></textarea>
</label>
<div class="rule-conditions">
<span>Conditions (all must match)</span>
<template x-for="(cond, idx) in ruleEdit.conditions" :key="idx">
<div class="condition-row">
<input type="text" x-model="cond.field" placeholder="field" list="ruleFieldOptions" required />
<select x-model="cond.op">
<option value="eq">equals</option>
<option value="neq">not equals</option>
<option value="contains">contains</option>
<option value="in">in list</option>
<option value="after_hours">after hours</option>
</select>
<input type="text" x-model="cond.value" placeholder="value" :required="cond.op !== 'after_hours'" />
<button type="button" class="ghost btn--compact" @click="ruleEdit.conditions.splice(idx, 1)"></button>
</div>
</template>
<button type="button" class="ghost btn--compact" @click="ruleEdit.conditions.push({field:'', op:'eq', value:''})">+ Add condition</button>
</div>
<datalist id="ruleFieldOptions">
<option value="service"></option>
<option value="operation"></option>
<option value="result"></option>
<option value="actor_display"></option>
<option value="timestamp"></option>
</datalist>
<div class="rule-form__actions">
<button type="submit">Save</button>
<button type="button" class="ghost" @click="ruleModalOpen = false">Cancel</button>
</div>
</form>
</div>
</div>
</section>
<section class="panel">
<div class="panel-header panel-header--collapsible" @click="togglePanel('filters')">
<h3>Filters</h3>
<span class="panel-toggle" :class="panelState.filters ? 'panel-toggle--open' : ''"></span>
</div>
<form id="filters" class="filters" @submit.prevent="resetPagination(); loadEvents()" x-show="panelState.filters">
  <div class="filter-row">
    <label>
      User (name/UPN)
@@ -159,8 +321,11 @@
</section>
<section class="panel" x-show="aiFeaturesEnabled">
<div class="panel-header panel-header--collapsible" @click="togglePanel('ask')">
  <h3>Ask a question</h3>
-  <form class="ask-form" @submit.prevent="askQuestion()">
+  <span class="panel-toggle" :class="panelState.ask ? 'panel-toggle--open' : ''"></span>
</div>
<form class="ask-form" @submit.prevent="askQuestion()" x-show="panelState.ask">
  <div class="ask-row">
    <input
      type="text"
@@ -184,11 +349,7 @@
  <template x-for="(evt, idx) in askEvents" :key="evt.id || idx">
    <article class="event event--compact">
      <div class="event__meta">
-        <span class="pill">
-          <span x-text="evt.display_category || evt.service || '—'"></span>
-          <span class="pill__action" @click.stop="addServiceFilter(evt.service || evt.display_category)" title="Include this service">+</span>
-          <span class="pill__action" @click.stop="removeServiceFilter(evt.service || evt.display_category)" title="Exclude this service"></span>
-        </span>
+        <span class="pill pill--clickable" x-text="evt.display_category || evt.service || '—'" @click="filterByService(evt.service || evt.display_category)" title="Filter by this service"></span>
        <span class="pill pill--clickable" :class="['success','succeeded','ok','passed','true'].includes((evt.result || '').toLowerCase()) ? 'pill--ok' : 'pill--warn'" x-text="evt.result || '—'" @click="filterByResult(evt.result)" title="Filter by this result"></span>
      </div>
      <h3 x-text="evt.operation || '—'"></h3>
@@ -206,10 +367,14 @@
</section>
<section class="panel">
-  <div class="panel-header">
+  <div class="panel-header panel-header--collapsible" @click="togglePanel('events')">
    <h2>Events</h2>
    <div style="display:flex;align-items:center;gap:10px;">
      <span id="count" x-text="countText"></span>
      <span class="panel-toggle" :class="panelState.events ? 'panel-toggle--open' : ''"></span>
    </div>
  </div>
  <div x-show="panelState.events">
  <div id="status" class="status" aria-live="polite" x-text="statusText"></div>
  <div id="events" class="events">
  <template x-for="(evt, idx) in events" :key="evt._id || evt.id || idx">
@@ -250,6 +415,7 @@
    <span x-text="`Page ${cursorStack.length + 1}`"></span>
    <button type="button" id="nextPage" :disabled="!nextCursor" @click="goNext()">Next</button>
  </div>
</div>
</section>
<div id="modal" class="modal hidden" role="dialog" aria-modal="true" aria-labelledby="modalTitle" :class="{ 'hidden': !modalOpen }">
@@ -309,14 +475,24 @@
accessToken: null,
authScopes: [],
filters: {
-  actor: '', selectedServices: [], search: '', operation: '', result: '', start: '', end: '', limit: 25, includeTags: '', excludeTags: '',
+  actor: '', selectedServices: [], search: '', operation: '', result: '', start: '', end: '', limit: 24, includeTags: '', excludeTags: '',
},
panelState: { sourceHealth: true, alerts: true, rules: true, filters: true, ask: true, events: true },
options: { actors: [], services: [], operations: [], results: [] },
savedSearches: [],
appVersion: '',
repoUrl: 'https://git.cqre.net/cqrenet/aoc',
docsUrl: 'https://git.cqre.net/cqrenet/aoc/src/branch/main/README.md',
aiFeaturesEnabled: true,
alertSummary: { total_open: 0, high: 0, medium: 0, low: 0 },
alerts: [],
alertsTotal: 0,
alertsPage: 1,
alertsFilter: { status: 'open', severity: '' },
rules: [],
ruleModalOpen: false,
ruleEditId: null,
ruleEdit: { name: '', enabled: true, severity: 'medium', message: '', conditions: [] },
askQuestionText: '',
askLoading: false,
askAnswer: '',
@@ -329,10 +505,14 @@
await this.loadVersion();
await this.initAuth();
this.loadSavedFilters();
this.loadPanelState();
if (!this.authConfig?.auth_enabled || this.accessToken) {
await this.loadFilterOptions();
await this.loadSavedSearches();
await this.loadSourceHealth();
await this.loadAlertSummary();
await this.loadAlerts();
await this.loadRules();
await this.loadEvents();
}
},
@@ -355,12 +535,33 @@
} catch {}
},
loadPanelState() {
try {
const saved = localStorage.getItem('aoc_panels');
if (saved) {
const parsed = JSON.parse(saved);
Object.keys(parsed).forEach((k) => { if (this.panelState[k] !== undefined) this.panelState[k] = parsed[k]; });
}
} catch {}
},
savePanelState() {
try {
localStorage.setItem('aoc_panels', JSON.stringify(this.panelState));
} catch {}
},
togglePanel(key) {
this.panelState[key] = !this.panelState[key];
this.savePanelState();
},
async loadVersion() {
try {
const res = await fetch('/api/version');
if (res.ok) {
const body = await res.json();
this.appVersion = (body.version || '').replace(/^v/, '');
}
} catch {}
},
@@ -390,9 +591,15 @@
async initAuth() {
try {
const res = await fetch('/api/config/auth');
if (!res.ok) {
console.error('Auth config fetch failed:', res.status, res.statusText);
this.authConfig = { auth_enabled: false, _error: res.status };
} else {
this.authConfig = await res.json();
}
} catch (err) {
console.error('Auth config fetch error:', err);
this.authConfig = { auth_enabled: false, _error: 'network' };
}
try {
@@ -402,6 +609,8 @@
this.aiFeaturesEnabled = featBody.ai_features_enabled !== false;
if (featBody.default_page_size) {
this.filters.limit = featBody.default_page_size;
} else {
this.filters.limit = 24;
}
} else {
this.aiFeaturesEnabled = true;
@@ -411,7 +620,17 @@
}
if (!this.authConfig?.auth_enabled) {
this.authBtnText = 'Auth: OFF';
console.warn('AOC auth is disabled. Set AUTH_ENABLED=true in .env to enable login.');
return;
}
const tenantId = this.authConfig.tenant_id;
const clientId = this.authConfig.client_id;
if (!clientId || !tenantId) {
this.authBtnText = 'Auth: misconfigured';
this.statusText = 'Auth is enabled but client_id or tenant_id is missing. Check .env configuration.';
console.error('AOC auth misconfigured: missing client_id or tenant_id in /api/config/auth');
return;
}
@@ -420,8 +639,6 @@
return;
}
const baseScope = this.authConfig.scope || "";
this.authScopes = Array.from(new Set(['openid', 'profile', 'email', ...baseScope.split(/[ ,]+/).filter(Boolean)]));
const authority = `https://login.microsoftonline.com/${tenantId}`;
@@ -571,9 +788,8 @@
const saved = localStorage.getItem('aoc_filters');
if (!saved && this.options.services.length) {
// Default: show all services (privacy controls handle exclusions server-side)
this.filters.selectedServices = [...this.options.services];
} else if (saved) {
try {
const parsed = JSON.parse(saved);
@@ -667,26 +883,15 @@
},
clearFilters() {
this.filters = { actor: '', selectedServices: [...this.options.services], search: '', operation: '', result: '', start: '', end: '', limit: 24, includeTags: '', excludeTags: '' };
this.saveFilters();
this.resetPagination();
this.loadEvents();
},
filterByService(service) {
if (!service) return;
this.filters.selectedServices = [service];
this.saveFilters();
this.resetPagination();
this.loadEvents();
@@ -700,6 +905,115 @@
this.loadEvents();
},
async loadAlertSummary() {
try {
const res = await fetch('/api/alerts/summary', { headers: this.authHeader() });
if (!res.ok) return;
const body = await res.json();
this.alertSummary.total_open = body.total_open || 0;
const sev = body.by_status_severity || [];
this.alertSummary.high = sev.filter((s) => s._id.severity === 'high' && s._id.status === 'open').reduce((a, b) => a + b.count, 0);
this.alertSummary.medium = sev.filter((s) => s._id.severity === 'medium' && s._id.status === 'open').reduce((a, b) => a + b.count, 0);
this.alertSummary.low = sev.filter((s) => s._id.severity === 'low' && s._id.status === 'open').reduce((a, b) => a + b.count, 0);
} catch {}
},
async loadAlerts() {
try {
const params = new URLSearchParams();
params.append('page_size', '20');
params.append('page', String(this.alertsPage));
if (this.alertsFilter.status) params.append('status', this.alertsFilter.status);
if (this.alertsFilter.severity) params.append('severity', this.alertsFilter.severity);
const res = await fetch(`/api/alerts?${params.toString()}`, { headers: this.authHeader() });
if (!res.ok) return;
const body = await res.json();
this.alerts = body.items || [];
this.alertsTotal = body.total || 0;
} catch {}
},
async updateAlertStatus(alertId, status) {
try {
const res = await fetch(`/api/alerts/${alertId}/status`, {
method: 'PATCH',
headers: { 'Content-Type': 'application/json', ...this.authHeader() },
body: JSON.stringify({ status }),
});
if (res.ok) {
await this.loadAlerts();
await this.loadAlertSummary();
}
} catch {}
},
async loadRules() {
try {
const res = await fetch('/api/rules', { headers: this.authHeader() });
if (!res.ok) return;
this.rules = await res.json();
} catch {}
},
openRuleEditor(rule) {
if (rule) {
this.ruleEditId = rule.id;
this.ruleEdit = {
name: rule.name,
enabled: rule.enabled,
severity: rule.severity,
message: rule.message,
conditions: JSON.parse(JSON.stringify(rule.conditions)),
};
} else {
this.ruleEditId = null;
this.ruleEdit = { name: '', enabled: true, severity: 'medium', message: '', conditions: [] };
}
this.ruleModalOpen = true;
},
async saveRule() {
const payload = { ...this.ruleEdit };
try {
const url = this.ruleEditId ? `/api/rules/${this.ruleEditId}` : '/api/rules';
const method = this.ruleEditId ? 'PUT' : 'POST';
const res = await fetch(url, {
method,
headers: { 'Content-Type': 'application/json', ...this.authHeader() },
body: JSON.stringify(payload),
});
if (!res.ok) throw new Error(await res.text());
this.ruleModalOpen = false;
await this.loadRules();
} catch (err) {
alert('Failed to save rule: ' + err.message);
}
},
async toggleRule(ruleId, enabled) {
try {
const rule = this.rules.find((r) => r.id === ruleId);
if (!rule) return;
const res = await fetch(`/api/rules/${ruleId}`, {
method: 'PUT',
headers: { 'Content-Type': 'application/json', ...this.authHeader() },
body: JSON.stringify({ ...rule, enabled }),
});
if (res.ok) await this.loadRules();
} catch {}
},
async deleteRule(ruleId) {
if (!confirm('Delete this rule?')) return;
try {
const res = await fetch(`/api/rules/${ruleId}`, {
method: 'DELETE',
headers: this.authHeader(),
});
if (res.ok) await this.loadRules();
} catch {}
},
async askQuestion() {
const q = this.askQuestionText.trim();
if (!q) return;


@@ -274,6 +274,31 @@ input {
margin-bottom: 8px;
}
.panel-header--collapsible {
cursor: pointer;
user-select: none;
padding: 4px 0;
margin-bottom: 0;
}
.panel-header--collapsible:hover {
opacity: 0.85;
}
.panel-toggle {
display: inline-block;
font-size: 14px;
color: var(--muted);
transition: transform 0.2s ease;
transform: rotate(-90deg);
width: 16px;
text-align: center;
}
.panel-toggle--open {
transform: rotate(0deg);
}
#count {
color: var(--muted);
font-size: 14px;
@@ -376,31 +401,6 @@ input {
background: rgba(249, 115, 22, 0.25);
}
.pill__action {
display: inline-flex;
align-items: center;
justify-content: center;
width: 16px;
height: 16px;
border-radius: 4px;
margin-left: 4px;
font-size: 11px;
font-weight: 700;
line-height: 1;
cursor: pointer;
color: var(--muted);
background: rgba(255, 255, 255, 0.06);
border: 1px solid transparent;
transition: all 0.12s ease;
vertical-align: middle;
}
.pill__action:hover {
color: var(--text);
background: rgba(125, 211, 252, 0.2);
border-color: rgba(125, 211, 252, 0.3);
}
.event h3 {
margin: 0 0 6px;
font-size: 17px;
@@ -716,6 +716,257 @@ input {
font-size: 12px;
}
/* Alert summary in hero */
.alert-summary {
display: flex;
align-items: center;
gap: 6px;
background: rgba(255, 255, 255, 0.04);
border: 1px solid var(--border);
border-radius: 999px;
padding: 6px 14px;
}
.alert-badge {
min-width: 22px;
height: 22px;
border-radius: 999px;
display: flex;
align-items: center;
justify-content: center;
font-size: 11px;
font-weight: 700;
color: #0b1220;
}
.alert-badge--high {
background: #ef4444;
}
.alert-badge--medium {
background: #f97316;
}
.alert-badge--low {
background: #3b82f6;
}
.alert-label {
font-size: 12px;
color: var(--muted);
}
.alert-open-count {
font-size: 13px;
color: var(--muted);
}
.alert-filters {
display: flex;
gap: 10px;
margin-bottom: 12px;
}
.alert-filters select {
padding: 8px 12px;
border-radius: 8px;
border: 1px solid var(--border);
background: rgba(255, 255, 255, 0.02);
color: var(--text);
font-size: 13px;
}
.alerts-list {
display: flex;
flex-direction: column;
gap: 10px;
}
.alert-card {
border: 1px solid var(--border);
border-radius: 12px;
padding: 12px 14px;
background: rgba(255, 255, 255, 0.02);
border-left: 3px solid transparent;
}
.alert-card--high {
border-left-color: #ef4444;
}
.alert-card--medium {
border-left-color: #f97316;
}
.alert-card--low {
border-left-color: #3b82f6;
}
.alert-card__meta {
display: flex;
gap: 8px;
align-items: center;
margin-bottom: 6px;
flex-wrap: wrap;
}
.alert-card__meta small {
color: var(--muted);
font-size: 12px;
}
.alert-card strong {
font-size: 14px;
display: block;
margin-bottom: 4px;
}
.alert-card p {
margin: 0 0 10px;
font-size: 13px;
color: var(--muted);
line-height: 1.45;
}
.alert-card__actions {
display: flex;
gap: 8px;
flex-wrap: wrap;
}
.alerts-empty {
padding: 20px;
text-align: center;
color: var(--muted);
font-size: 14px;
border: 1px dashed var(--border);
border-radius: 10px;
}
/* Rules management */
.rules-list {
display: flex;
flex-direction: column;
gap: 10px;
}
.rule-card {
border: 1px solid var(--border);
border-radius: 12px;
padding: 12px 14px;
background: rgba(255, 255, 255, 0.02);
}
.rule-card--disabled {
opacity: 0.6;
}
.rule-card__meta {
display: flex;
gap: 8px;
align-items: center;
margin-bottom: 6px;
flex-wrap: wrap;
}
.toggle-label {
display: flex;
align-items: center;
gap: 6px;
font-size: 12px;
color: var(--muted);
cursor: pointer;
}
.toggle-label input[type="checkbox"] {
width: 14px;
height: 14px;
accent-color: var(--accent-strong);
}
.rule-card strong {
font-size: 14px;
display: block;
margin-bottom: 4px;
}
.rule-card p {
margin: 0 0 8px;
font-size: 13px;
color: var(--muted);
line-height: 1.4;
}
.rule-card__conditions {
display: flex;
flex-wrap: wrap;
gap: 6px;
margin-bottom: 10px;
}
.rule-card__actions {
display: flex;
gap: 8px;
}
.rules-empty {
padding: 20px;
text-align: center;
color: var(--muted);
font-size: 14px;
border: 1px dashed var(--border);
border-radius: 10px;
}
.rule-form {
display: flex;
flex-direction: column;
gap: 14px;
}
.rule-form label {
display: flex;
flex-direction: column;
gap: 6px;
font-size: 14px;
color: var(--muted);
}
.rule-form input,
.rule-form select,
.rule-form textarea {
padding: 10px 12px;
border-radius: 10px;
border: 1px solid var(--border);
background: rgba(255, 255, 255, 0.02);
color: var(--text);
font-size: 14px;
}
.rule-conditions {
display: flex;
flex-direction: column;
gap: 10px;
}
.condition-row {
display: flex;
gap: 8px;
align-items: center;
}
.condition-row input,
.condition-row select {
flex: 1;
min-width: 0;
}
.rule-form__actions {
display: flex;
gap: 10px;
margin-top: 8px;
}
@media (max-width: 640px) {
.topbar {
flex-direction: column;


@@ -1,12 +1,13 @@
import asyncio
import logging
import os
import time
from contextlib import suppress
from pathlib import Path
import structlog
from audit_trail import log_action
from config import AI_FEATURES_ENABLED, AUTH_ENABLED, CORS_ORIGINS, ENABLE_PERIODIC_FETCH, FETCH_INTERVAL_MINUTES
from database import setup_indexes
from fastapi import FastAPI, HTTPException, Request
from fastapi.middleware.cors import CORSMiddleware
@@ -14,6 +15,7 @@ from fastapi.responses import Response
from fastapi.staticfiles import StaticFiles
from metrics import observe_request, prometheus_metrics
from middleware import CorrelationIdMiddleware
from routes.alerts import router as alerts_router
from routes.config import router as config_router
from routes.events import router as events_router
from routes.fetch import router as fetch_router
@@ -51,10 +53,17 @@ logger = structlog.get_logger("aoc.fetcher")
app = FastAPI()
# CORS: warn if wildcard is used with auth enabled, but do not break deployments
_effective_cors = CORS_ORIGINS
if AUTH_ENABLED and "*" in _effective_cors:
logger.warning(
"CORS wildcard (*) is insecure when AUTH_ENABLED=true. Set CORS_ORIGINS to your actual origin(s) in production."
)
app.add_middleware(CorrelationIdMiddleware)
app.add_middleware(
CORSMiddleware,
allow_origins=_effective_cors,
allow_credentials=True,
allow_methods=["*"],
allow_headers=["*"],
@@ -79,27 +88,43 @@ async def cache_control_middleware(request: Request, call_next):
response.headers["Cache-Control"] = "no-cache, no-store, must-revalidate"
response.headers["Pragma"] = "no-cache"
response.headers["Expires"] = "0"
# Basic CSP for the UI and API (allows MSAL auth flows)
if request.url.path.startswith("/api/") or request.url.path in ("/", "/index.html"):
response.headers["Content-Security-Policy"] = (
"default-src 'self'; "
"script-src 'self' 'unsafe-inline' cdn.jsdelivr.net alcdn.msauth.net; "
"style-src 'self' 'unsafe-inline'; "
"connect-src 'self' https://login.microsoftonline.com; "
"frame-src 'self' https://login.microsoftonline.com; "
"form-action 'self' https://login.microsoftonline.com; "
"img-src 'self' data:;"
)
return response
@app.middleware("http")
async def rate_limit_middleware(request: Request, call_next):
"""Apply Redis-backed rate limiting before processing the request."""
# Exempt config and health endpoints from rate limiting
exempt_paths = {"/api/config/auth", "/api/config/features", "/health", "/metrics"}
if request.url.path.startswith("/api/") and request.url.path not in exempt_paths:
from rate_limiter import check_rate_limit
await check_rate_limit(request)
return await call_next(request)
@app.middleware("http")
async def audit_middleware(request: Request, call_next):
response = await call_next(request)
if request.url.path.startswith("/api/") and request.method in ("POST", "PATCH", "PUT", "DELETE"):
user = "anonymous"
if AUTH_ENABLED:
from auth import _auth_context
claims = _auth_context.get(None)
if isinstance(claims, dict):
user = claims.get("sub", "unknown")
log_action(
action=request.method.lower(),
resource=request.url.path,
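The audit middleware above stops parsing unverified bearer tokens and instead reads claims that the auth dependency has already verified and stashed in a `ContextVar`. A minimal, framework-free sketch of that pattern (names like `_auth_context` are illustrative, not the project's actual internals):

```python
# Sketch: a ContextVar set during auth verification is visible later in the
# same request's context, and sets inside one task never leak into another.
import asyncio
import contextvars

_auth_context = contextvars.ContextVar("auth_claims", default=None)

async def audit() -> str:
    # Reads whatever the current task's context holds; no header parsing.
    claims = _auth_context.get()
    return claims.get("sub", "unknown") if isinstance(claims, dict) else "anonymous"

async def handle_request(sub: str) -> str:
    # Stand-in for the real JWT-verifying dependency storing verified claims.
    _auth_context.set({"sub": sub})
    return await audit()

async def main():
    # Each task created here runs in its own copy of the current context.
    users = await asyncio.gather(
        asyncio.create_task(handle_request("alice")),
        asyncio.create_task(handle_request("bob")),
    )
    leaked = _auth_context.get()  # still None: task-local sets don't escape
    return users, leaked
```

Because `asyncio.create_task` copies the caller's context, concurrent requests cannot observe each other's claims, which is exactly why this is safer than a module-level variable.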
@@ -123,6 +148,7 @@ if AI_FEATURES_ENABLED:
app.mount("/mcp", mcp_asgi)
app.include_router(saved_searches_router, prefix="/api")
app.include_router(rules_router, prefix="/api")
app.include_router(alerts_router, prefix="/api")
app.include_router(jobs_router, prefix="/api")
@@ -145,11 +171,28 @@ async def metrics():
@app.get("/api/version")
async def version():
return {"version": os.environ.get("VERSION", "unknown")}
@app.exception_handler(Exception)
async def generic_exception_handler(request: Request, exc: Exception):
"""Return generic error messages for unhandled exceptions to avoid info leakage."""
if isinstance(exc, HTTPException):
from fastapi.responses import JSONResponse
return JSONResponse(
status_code=exc.status_code,
content={"detail": exc.detail},
headers=getattr(exc, "headers", None) or {},
)
logger.error("Unhandled exception", path=request.url.path, error=str(exc))
return Response(
content='{"detail":"Internal server error"}',
status_code=500,
media_type="application/json",
)
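Per the v1.7.9 commit message, the catch-all handler previously mishandled `HTTPException`s; the fix dispatches on the exception type so structured errors keep their status and detail while everything else collapses to a generic 500. The core dispatch logic can be sketched without the framework (the `HTTPError` class below is a stand-in for `fastapi.HTTPException`):

```python
# Sketch: structured errors pass through with their status and detail;
# unexpected exceptions never leak internals to the client.
class HTTPError(Exception):  # stand-in for fastapi.HTTPException
    def __init__(self, status_code: int, detail: str):
        self.status_code = status_code
        self.detail = detail

def to_response(exc: Exception) -> tuple[int, dict]:
    if isinstance(exc, HTTPError):
        # e.g. a 404 from a route handler keeps its JSON shape
        return exc.status_code, {"detail": exc.detail}
    # anything else (DB errors, bugs) is hidden behind a generic message
    return 500, {"detail": "Internal server error"}
```

This is why the `ask` endpoint below can safely raise `HTTPException(500, "Database query failed")`: the client sees the sanitized detail, while the original exception only reaches the logs.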
frontend_dir = Path(__file__).parent / "frontend"
app.mount("/", StaticFiles(directory=frontend_dir, html=True), name="frontend")
@@ -167,6 +210,15 @@ async def _periodic_fetch():
@app.on_event("startup")
async def start_periodic_fetch():
setup_indexes()
from rules import seed_default_rules
seed_default_rules()
logger.info(
"AOC startup",
version=os.environ.get("VERSION", "unknown"),
auth_enabled=AUTH_ENABLED,
ai_enabled=AI_FEATURES_ENABLED,
)
if ENABLE_PERIODIC_FETCH:
app.state.fetch_task = asyncio.create_task(_periodic_fetch())


@@ -41,6 +41,15 @@ from mcp_common import (
handle_search_events,
)
# Security warning: this standalone stdio server has no authentication.
# Only run it in trusted environments (e.g. local Claude Desktop) and
# ensure the MongoDB connection uses authenticated credentials.
print("=" * 60, file=sys.stderr)
print("AOC MCP Server (stdio transport)", file=sys.stderr)
print("WARNING: No authentication layer. Only run in trusted", file=sys.stderr)
print("environments or behind a VPN. See AGENTS.md for details.", file=sys.stderr)
print("=" * 60, file=sys.stderr)
app = Server("aoc")


@@ -63,12 +63,18 @@ class CommentAddRequest(BaseModel):
text: str
class AlertCondition(BaseModel):
field: str
op: str # eq, neq, contains, in, after_hours
value: str | list[str] | None = None
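Tightening `conditions` from `list[dict]` to a typed `AlertCondition` means malformed rules are rejected at the API boundary instead of failing silently at evaluation time. A framework-free sketch of the equivalent check (the `op` list comes from the comment above; everything else here is illustrative, not the project's validator):

```python
# Sketch: enforce the same shape the stricter Pydantic model implies.
ALLOWED_OPS = {"eq", "neq", "contains", "in", "after_hours"}

def validate_condition(cond: dict) -> dict:
    """Return a normalized condition or raise ValueError for bad input."""
    if not isinstance(cond.get("field"), str) or not cond["field"]:
        raise ValueError("condition.field must be a non-empty string")
    if cond.get("op") not in ALLOWED_OPS:
        raise ValueError(f"condition.op must be one of {sorted(ALLOWED_OPS)}")
    value = cond.get("value")
    if value is not None and not isinstance(value, (str, list)):
        raise ValueError("condition.value must be a string, list, or null")
    if isinstance(value, list) and not all(isinstance(v, str) for v in value):
        raise ValueError("condition.value list items must be strings")
    return {"field": cond["field"], "op": cond["op"], "value": value}
```

With the strict model, an arbitrary operator (say a regex injection attempt) fails validation with a 422 rather than reaching the rule engine.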
class AlertRuleResponse(BaseModel):
id: str | None = None
name: str
enabled: bool
severity: str
conditions: list[AlertCondition]
message: str

backend/notifications.py (new file)

@@ -0,0 +1,172 @@
"""Pluggable notification channels for admin-ops alerts.
Supported channels:
- webhook: POST JSON to any URL (Slack, Teams, generic)
"""
from datetime import UTC, datetime
import requests
import structlog
from tenacity import retry, retry_if_exception_type, stop_after_attempt, wait_exponential
logger = structlog.get_logger("aoc.notifications")
WEBHOOK_TIMEOUT = 15
@retry(
stop=stop_after_attempt(3),
wait=wait_exponential(multiplier=1, min=2, max=10),
retry=retry_if_exception_type((requests.ConnectionError, requests.Timeout)),
reraise=True,
)
def _post_webhook(url: str, payload: dict) -> requests.Response:
"""POST to webhook with retry on connection/timeout errors."""
return requests.post(url, json=payload, timeout=WEBHOOK_TIMEOUT, headers={"Content-Type": "application/json"})
def _build_slack_payload(rule_name: str, severity: str, message: str, event: dict) -> dict:
"""Build a Slack-compatible block payload."""
color = {"high": "#ef4444", "medium": "#f97316", "low": "#3b82f6"}.get(severity, "#94a3b8")
ts = event.get("timestamp", "?")
op = event.get("operation", "unknown")
actor = event.get("actor_display", "unknown")
targets = ", ".join(event.get("target_displays", [])) or ""
svc = event.get("service", "unknown")
return {
"text": f"[{severity.upper()}] {rule_name}: {message}",
"attachments": [
{
"color": color,
"fields": [
{"title": "Rule", "value": rule_name, "short": True},
{"title": "Severity", "value": severity.upper(), "short": True},
{"title": "Service", "value": svc, "short": True},
{"title": "Action", "value": op, "short": True},
{"title": "Actor", "value": actor, "short": True},
{"title": "Target", "value": targets, "short": True},
{"title": "Time", "value": ts, "short": False},
],
"footer": "AOC Admin Operations Center",
}
],
}
def _build_teams_payload(rule_name: str, severity: str, message: str, event: dict) -> dict:
"""Build a Microsoft Teams adaptive card payload."""
color = {"high": "Attention", "medium": "Warning", "low": "Good"}.get(severity, "Default")
ts = event.get("timestamp", "?")
op = event.get("operation", "unknown")
actor = event.get("actor_display", "unknown")
targets = ", ".join(event.get("target_displays", [])) or ""
svc = event.get("service", "unknown")
return {
"type": "message",
"attachments": [
{
"contentType": "application/vnd.microsoft.card.adaptive",
"content": {
"$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
"type": "AdaptiveCard",
"version": "1.4",
"body": [
{
"type": "TextBlock",
"text": f"🚨 {severity.upper()}: {rule_name}",
"weight": "Bolder",
"size": "Medium",
"color": color,
},
{"type": "TextBlock", "text": message, "wrap": True},
{
"type": "FactSet",
"facts": [
{"title": "Service:", "value": svc},
{"title": "Action:", "value": op},
{"title": "Actor:", "value": actor},
{"title": "Target:", "value": targets},
{"title": "Time:", "value": ts},
],
},
],
},
}
],
}
def _build_generic_payload(rule_name: str, severity: str, message: str, event: dict) -> dict:
"""Build a generic JSON payload."""
return {
"alert": {
"rule_name": rule_name,
"severity": severity,
"message": message,
"timestamp": datetime.now(UTC).isoformat(),
},
"event": {
"id": event.get("id"),
"timestamp": event.get("timestamp"),
"service": event.get("service"),
"operation": event.get("operation"),
"actor_display": event.get("actor_display"),
"target_displays": event.get("target_displays"),
"result": event.get("result"),
},
}
def send_notification(
webhook_url: str,
format_type: str,
rule_name: str,
severity: str,
message: str,
event: dict,
) -> bool:
"""Send an alert notification to the configured channel.
Args:
webhook_url: URL to POST to.
format_type: "slack", "teams", or "generic".
rule_name: Name of the triggered rule.
severity: high, medium, or low.
message: Human-readable alert message.
event: The normalized event that triggered the alert.
Returns:
True if delivery succeeded, False otherwise.
"""
if not webhook_url:
return False
builders = {
"slack": _build_slack_payload,
"teams": _build_teams_payload,
"generic": _build_generic_payload,
}
builder = builders.get(format_type, _build_generic_payload)
payload = builder(rule_name, severity, message, event)
try:
res = _post_webhook(webhook_url, payload)
res.raise_for_status()
logger.info(
"Notification sent",
rule=rule_name,
severity=severity,
format=format_type,
status_code=res.status_code,
)
return True
except Exception as exc:
logger.warning(
"Notification failed after retries",
rule=rule_name,
severity=severity,
format=format_type,
error=str(exc),
)
return False
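`_post_webhook` retries up to three attempts with exponential backoff via tenacity's `wait_exponential(multiplier=1, min=2, max=10)`. The waits form a clamped geometric sequence; a sketch of that schedule (hedged: tenacity's exact attempt indexing may differ slightly from the exponent used here):

```python
def backoff_schedule(attempts: int, multiplier: float = 1, lo: float = 2, hi: float = 10):
    # Clamped exponential waits between retries: multiplier * 2**n, bounded to [lo, hi].
    return [min(max(multiplier * (2 ** n), lo), hi) for n in range(attempts)]
```

So a failing webhook is retried after roughly 2s, 2s, then 4s before `reraise=True` surfaces the final error to `send_notification`, which logs it and returns `False`.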

backend/rate_limiter.py (new file)

@@ -0,0 +1,82 @@
"""Simple Redis-backed fixed-window rate limiter."""
import time
import structlog
from config import RATE_LIMIT_ENABLED, RATE_LIMIT_REQUESTS, RATE_LIMIT_WINDOW_SECONDS
from fastapi import HTTPException, Request
from redis_client import get_redis
logger = structlog.get_logger("aoc.rate_limit")
class RateLimitExceeded(HTTPException):
def __init__(self, retry_after: int):
super().__init__(
status_code=429,
detail="Rate limit exceeded. Please slow down.",
headers={"Retry-After": str(retry_after)},
)
def _get_identifier(request: Request) -> str:
"""Best-effort client identifier: authenticated sub, or X-Forwarded-For, or client host."""
user = getattr(request.state, "user", None)
if user and isinstance(user, dict):
sub = user.get("sub")
if sub and sub != "anonymous":
return f"user:{sub}"
forwarded = request.headers.get("x-forwarded-for")
if forwarded:
return f"ip:{forwarded.split(',')[0].strip()}"
return f"ip:{request.client.host if request.client else 'unknown'}"
def _get_path_category(path: str) -> str:
"""Bucket paths into rate-limit categories."""
if path.startswith("/api/fetch"):
return "fetch"
if path.startswith("/api/ask"):
return "ask"
if path.startswith("/api/events/bulk-tags"):
return "write"
return "default"
def _limit_for_category(category: str) -> tuple[int, int]:
"""Return (max_requests, window_seconds) for a category."""
if category == "fetch":
return (10, 3600) # 10 per hour
if category == "ask":
return (30, 60) # 30 per minute
if category == "write":
return (20, 60) # 20 per minute
return (RATE_LIMIT_REQUESTS, RATE_LIMIT_WINDOW_SECONDS)
async def check_rate_limit(request: Request):
"""Raise RateLimitExceeded if the client has exceeded their quota."""
if not RATE_LIMIT_ENABLED:
return
category = _get_path_category(request.url.path)
limit, window = _limit_for_category(category)
identifier = _get_identifier(request)
now = int(time.time())
window_key = now // window
redis_key = f"rate_limit:{identifier}:{category}:{window_key}"
try:
redis = await get_redis()
count = await redis.incr(redis_key)
if count == 1:
await redis.expire(redis_key, window)
if count > limit:
raise RateLimitExceeded(retry_after=window - (now % window))
except RateLimitExceeded:
raise
except Exception as exc:
logger.warning("Rate limiter Redis error; allowing request", error=str(exc))
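The Redis scheme above is a classic fixed window: the counter key embeds `now // window`, so counts reset automatically when the integer division rolls over, and `EXPIRE` merely garbage-collects stale keys. An in-memory sketch of the same algorithm (illustrative stand-in, not the Redis-backed implementation; the injectable `clock` exists only to make the rollover visible):

```python
import time

class FixedWindowLimiter:
    """In-memory sketch of a fixed-window limiter keyed by (id, category, window)."""

    def __init__(self, limit: int, window: int, clock=time.time):
        self.limit, self.window, self.clock = limit, window, clock
        self.counts = {}  # (identifier, category, window_index) -> count

    def allow(self, identifier: str, category: str = "default") -> bool:
        now = int(self.clock())
        key = (identifier, category, now // self.window)  # same trick as the Redis key
        self.counts[key] = self.counts.get(key, 0) + 1
        return self.counts[key] <= self.limit
```

One known property of fixed windows: a client can burst up to 2× the limit across a window boundary (limit requests at the end of one window, limit more at the start of the next), which is usually an acceptable trade-off for the simplicity of a single `INCR`.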

backend/routes/alerts.py (new file)

@@ -0,0 +1,78 @@
"""Alert management endpoints."""
from auth import require_auth
from bson import ObjectId
from database import alerts_collection
from fastapi import APIRouter, Depends, HTTPException, Query
from pydantic import BaseModel
router = APIRouter(dependencies=[Depends(require_auth)])
class AlertStatusUpdate(BaseModel):
status: str # open | acknowledged | resolved | false_positive
class AlertListResponse(BaseModel):
items: list[dict]
total: int
@router.get("/alerts", response_model=AlertListResponse)
def list_alerts(
status: str = Query(default="", description="Filter by status"),
severity: str = Query(default="", description="Filter by severity"),
rule_name: str = Query(default="", description="Filter by rule name"),
page_size: int = Query(default=50, ge=1, le=200),
page: int = Query(default=1, ge=1),
):
query = {}
if status:
query["status"] = status
if severity:
query["severity"] = severity
if rule_name:
query["rule_name"] = {"$regex": rule_name, "$options": "i"}
total = alerts_collection.count_documents(query)
skip = (page - 1) * page_size
cursor = alerts_collection.find(query, {"_id": 0}).sort("timestamp", -1).skip(skip).limit(page_size)
return {"items": list(cursor), "total": total}
@router.patch("/alerts/{alert_id}/status")
def update_alert_status(alert_id: str, body: AlertStatusUpdate):
result = alerts_collection.update_one(
{"_id": ObjectId(alert_id)},
{"$set": {"status": body.status}},
)
if result.matched_count == 0:
raise HTTPException(status_code=404, detail="Alert not found")
return {"updated": True, "status": body.status}
@router.get("/alerts/summary")
def alert_summary():
"""Return counts by status and severity for the dashboard."""
pipeline = [
{
"$group": {
"_id": {"status": "$status", "severity": "$severity"},
"count": {"$sum": 1},
}
}
]
by_status_severity = list(alerts_collection.aggregate(pipeline))
total_open = alerts_collection.count_documents({"status": "open"})
total_acknowledged = alerts_collection.count_documents({"status": "acknowledged"})
total_resolved = alerts_collection.count_documents({"status": "resolved"})
total_false_positive = alerts_collection.count_documents({"status": "false_positive"})
return {
"total_open": total_open,
"total_acknowledged": total_acknowledged,
"total_resolved": total_resolved,
"total_false_positive": total_false_positive,
"by_status_severity": by_status_severity,
}
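The `$group` pipeline in `alert_summary` produces one document per `(status, severity)` pair, which the frontend's `loadAlertSummary` then filters down to open-high/medium/low counts. A pure-Python sketch of the same aggregation, useful for seeing the response shape without a MongoDB instance (illustrative only):

```python
from collections import Counter

def summarize(alerts: list) -> dict:
    # Equivalent of the $group stage: count alerts per (status, severity) pair.
    pairs = Counter((a.get("status"), a.get("severity")) for a in alerts)
    by_status = Counter(a.get("status") for a in alerts)
    return {
        "total_open": by_status.get("open", 0),
        "by_status_severity": [
            {"_id": {"status": s, "severity": sev}, "count": n}
            for (s, sev), n in sorted(pairs.items())
        ],
    }
```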


@@ -397,8 +397,31 @@ def _format_events_for_llm(
return "\n".join(lines)
def _validate_llm_url(url: str):
"""Prevent SSRF by rejecting internal/reserved addresses."""
from urllib.parse import urlparse
parsed = urlparse(url)
if parsed.scheme != "https":
raise RuntimeError("LLM_BASE_URL must use HTTPS")
hostname = (parsed.hostname or "").lower()
if not hostname:
raise RuntimeError("LLM_BASE_URL must have a valid hostname")
blocked = {"localhost", "127.0.0.1", "0.0.0.0", "::1", "169.254.169.254"}
if hostname in blocked:
raise RuntimeError(f"LLM_BASE_URL hostname '{hostname}' is not allowed")
# Block link-local and private IP ranges
import ipaddress
try:
ip = ipaddress.ip_address(hostname)
if ip.is_private or ip.is_loopback or ip.is_link_local or ip.is_reserved:
raise RuntimeError(f"LLM_BASE_URL IP '{hostname}' is not allowed")
except ValueError:
pass # hostname is not an IP, which is fine
def _build_chat_url(base_url: str, api_version: str) -> str:
"""Construct the chat completions URL, handling Azure OpenAI endpoints."""
base = base_url.rstrip("/")
url = base if base.endswith("/chat/completions") else f"{base}/chat/completions"
if api_version:
@@ -424,6 +447,9 @@ async def _call_llm(
},
]
# SSRF guard: only allow known public HTTPS endpoints
_validate_llm_url(LLM_BASE_URL)
url = _build_chat_url(LLM_BASE_URL, LLM_API_VERSION)
headers = {
"Content-Type": "application/json",
@@ -570,6 +596,8 @@ async def _explain_event(event: dict, related: list[dict]) -> str:
},
]
_validate_llm_url(LLM_BASE_URL)
url = _build_chat_url(LLM_BASE_URL, LLM_API_VERSION)
headers = {"Content-Type": "application/json"}
if "azure" in LLM_BASE_URL.lower() or "cognitiveservices" in LLM_BASE_URL.lower():
@@ -731,7 +759,7 @@ async def ask_question(body: AskRequest, user: dict = Depends(require_auth)):
raw_events = list(cursor)
except Exception as exc:
logger.error("Failed to query events for ask", error=str(exc))
-raise HTTPException(status_code=500, detail=f"Database query failed: {exc}") from exc
+raise HTTPException(status_code=500, detail="Database query failed") from exc
for e in raw_events:
e["_id"] = str(e.get("_id", ""))
@@ -803,7 +831,6 @@ async def ask_question(body: AskRequest, user: dict = Depends(require_auth)):
"total_matched": total,
"services_queried": query_services,
"excluded_services": excluded_services,
-"mongo_query": json.dumps(query, default=str),
},
llm_used=False,
llm_error=None,
@@ -863,7 +890,6 @@ async def ask_question(body: AskRequest, user: dict = Depends(require_auth)):
"total_matched": total,
"services_queried": query_services,
"excluded_services": excluded_services,
-"mongo_query": json.dumps(query, default=str),
},
llm_used=llm_used,
llm_error=llm_error,
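The SSRF guard added in this file can be exercised in isolation. Below is a simplified, self-contained mirror of `_validate_llm_url` that returns a boolean instead of raising (the function name is illustrative). Like the original, it does not resolve DNS, so a public hostname that resolves to a private address would still pass; that is a known limitation of hostname-only checks:

```python
import ipaddress
from urllib.parse import urlparse

BLOCKED_HOSTS = {"localhost", "127.0.0.1", "0.0.0.0", "::1", "169.254.169.254"}

def is_allowed_llm_url(url: str) -> bool:
    # HTTPS only; reject well-known local/metadata hosts and any literal IP
    # that is private, loopback, link-local, or reserved.
    parsed = urlparse(url)
    if parsed.scheme != "https" or not parsed.hostname:
        return False
    hostname = parsed.hostname.lower()
    if hostname in BLOCKED_HOSTS:
        return False
    try:
        ip = ipaddress.ip_address(hostname)
    except ValueError:
        return True  # named host; the resolved address is not re-checked here
    return not (ip.is_private or ip.is_loopback or ip.is_link_local or ip.is_reserved)
```

Running the check once per request (as `_call_llm` does) also covers configs reloaded at runtime, not just startup values.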

View File

@@ -1,3 +1,4 @@
import structlog
from config import (
AI_FEATURES_ENABLED,
AUTH_CLIENT_ID,
@@ -9,10 +10,12 @@ from config import (
from fastapi import APIRouter
router = APIRouter()
logger = structlog.get_logger("aoc.config")
@router.get("/config/auth")
def auth_config():
logger.debug("Auth config requested", auth_enabled=AUTH_ENABLED)
return {
"auth_enabled": AUTH_ENABLED,
"tenant_id": AUTH_TENANT_ID,

View File

@@ -158,7 +158,7 @@ def list_events(
cursor_query = events_collection.find(query).sort([("timestamp", -1), ("_id", -1)]).limit(safe_page_size)
events = list(cursor_query)
except Exception as exc:
-raise HTTPException(status_code=500, detail=f"Failed to query events: {exc}") from exc
+raise HTTPException(status_code=500, detail="Failed to query events") from exc
next_cursor = None
if len(events) == safe_page_size:
@@ -241,9 +241,17 @@ def bulk_tags(
update = {"$set": {"tags": tags}} if body.mode == "replace" else {"$addToSet": {"tags": {"$each": tags}}}
try:
matched = events_collection.count_documents(query, limit=10001)
if matched > 10000:
raise HTTPException(
status_code=400,
detail="Bulk tag update matches too many events (>10000). Narrow your filters.",
)
result_obj = events_collection.update_many(query, update)
except HTTPException:
raise
except Exception as exc:
-raise HTTPException(status_code=500, detail=f"Failed to update tags: {exc}") from exc
+raise HTTPException(status_code=500, detail="Failed to update tags") from exc
log_action(
"bulk_tags",
@@ -268,7 +276,7 @@ def filter_options(
actor_upns = sorted([a for a in events_collection.distinct("actor_upn") if a])[:safe_limit]
devices = sorted([a for a in events_collection.distinct("target_displays") if isinstance(a, str)])[:safe_limit]
except Exception as exc:
-raise HTTPException(status_code=500, detail=f"Failed to load filter options: {exc}") from exc
+raise HTTPException(status_code=500, detail="Failed to load filter options") from exc
if not user_can_access_privacy_services(user):
services = [s for s in services if s not in PRIVACY_SERVICES]
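The cap added to `bulk_tags` relies on `count_documents(query, limit=10001)`: passing `limit` lets MongoDB stop counting at the cap plus one instead of scanning the full result set. A minimal sketch of the pattern, with the collection call injected so it runs without a database (all names here are illustrative, not from the codebase):

```python
MAX_BULK = 10_000

class BulkLimitExceeded(Exception):
    pass

def check_bulk_cap(count_documents, query: dict, cap: int = MAX_BULK) -> int:
    # count_documents is called with limit=cap+1 so the server can stop
    # counting early; any value above cap means "too many, reject".
    matched = count_documents(query, limit=cap + 1)
    if matched > cap:
        raise BulkLimitExceeded(f"matches {matched}+ documents, cap is {cap}")
    return matched

# Hypothetical stand-in for a pymongo collection's count_documents:
def fake_count(query, limit):
    return min(25_000, limit)
```

In the route itself the raised error becomes the 400 response, while genuinely small updates fall through to `update_many` unchanged.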

View File

@@ -1,5 +1,6 @@
import time
import structlog
from audit_trail import log_action
from auth import require_auth
from config import ALERTS_ENABLED
@@ -15,6 +16,8 @@ from sources.intune_audit import fetch_intune_audit
from sources.unified_audit import fetch_unified_audit
from watermark import get_watermark, set_watermark
logger = structlog.get_logger("aoc.fetch")
router = APIRouter(dependencies=[Depends(require_auth)])
@@ -85,5 +88,8 @@ def fetch_logs(
user.get("sub", "anonymous"),
)
return result
except HTTPException:
raise
except Exception as exc:
-raise HTTPException(status_code=502, detail=str(exc)) from exc
+logger.error("Fetch failed", error=str(exc))
raise HTTPException(status_code=502, detail="Failed to fetch audit logs") from exc

View File

@@ -1,4 +1,5 @@
import structlog
from config import WEBHOOK_CLIENT_SECRET
from fastapi import APIRouter, Request, Response
router = APIRouter()
@@ -10,9 +11,12 @@ async def graph_webhook(request: Request):
"""
Receive Microsoft Graph change notifications.
Handles the validation handshake by echoing validationToken.
Validates clientState on notifications to prevent spoofing.
"""
validation_token = request.query_params.get("validationToken")
if validation_token:
# Microsoft sends validationToken as a query param during subscription creation.
# Echo it back as plain text to prove endpoint ownership.
return Response(content=validation_token, media_type="text/plain")
try:
@@ -21,12 +25,26 @@ async def graph_webhook(request: Request):
logger.warning("Invalid webhook payload", error=str(exc))
return Response(status_code=400)
-for notification in body.get("value", []):
+notifications = body.get("value", [])
if not isinstance(notifications, list):
logger.warning("Invalid webhook payload structure")
return Response(status_code=400)
for notification in notifications:
client_state = notification.get("clientState")
if WEBHOOK_CLIENT_SECRET and client_state != WEBHOOK_CLIENT_SECRET:
logger.warning(
"Graph webhook rejected: invalid clientState",
change_type=notification.get("changeType"),
resource=notification.get("resource"),
)
return Response(status_code=401)
logger.info(
"Received Graph notification",
change_type=notification.get("changeType"),
resource=notification.get("resource"),
-client_state=notification.get("clientState"),
+client_state=client_state,
)
return {"status": "accepted"}
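The handler above compares `clientState` with plain `!=`. A hardening option worth considering (not in this commit) is `hmac.compare_digest`, which runs in constant time and avoids leaking the secret through timing differences; a sketch with an illustrative secret value and helper name:

```python
import hmac

WEBHOOK_CLIENT_SECRET = "expected-secret"  # illustrative; the real value comes from config

def client_state_valid(notification: dict, secret: str = WEBHOOK_CLIENT_SECRET) -> bool:
    # Constant-time alternative to the plain != comparison in the handler above.
    if not secret:
        return True  # validation disabled when no secret is configured
    client_state = notification.get("clientState") or ""
    return hmac.compare_digest(client_state, secret)
```

The `or ""` guard also keeps the comparison well-typed when a notification omits `clientState` entirely.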

View File

@@ -1,7 +1,18 @@
-from datetime import UTC, datetime
"""Rule-based alerting for admin operations.
Rules are evaluated during event ingestion. Triggered alerts are stored in MongoDB
and optionally forwarded to a notification channel (webhook, Slack, Teams).
Deduplication: the same rule firing for the same actor within ALERT_DEDUPE_MINUTES
produces only one alert.
"""
from datetime import UTC, datetime, timedelta
import structlog
from config import ALERT_DEDUPE_MINUTES, ALERT_WEBHOOK_FORMAT, ALERT_WEBHOOK_URL
from database import db
from pymongo import ASCENDING
logger = structlog.get_logger("aoc.rules")
rules_collection = db["alert_rules"]
@@ -18,6 +29,13 @@ def evaluate_event(event: dict) -> list[dict]:
rules = load_rules()
for rule in rules:
if _matches(rule, event):
if _is_duplicate(rule, event):
logger.debug(
"Alert deduplicated",
rule=rule.get("name"),
event_id=event.get("id"),
)
continue
triggered.append(rule)
_create_alert(rule, event)
return triggered
@@ -50,6 +68,9 @@ def _matches(rule: dict, event: dict) -> bool:
return False
except Exception:
return False
if op == "threshold_count":
# Threshold rules are evaluated at query time, not per-event
return False
return True
@@ -64,7 +85,22 @@ def _get_nested(obj: dict, path: str):
return val
def _is_duplicate(rule: dict, event: dict) -> bool:
"""Check if an alert for this rule + actor was recently created."""
if ALERT_DEDUPE_MINUTES <= 0:
return False
cutoff = (datetime.now(UTC) - timedelta(minutes=ALERT_DEDUPE_MINUTES)).isoformat()
actor = event.get("actor_display") or event.get("actor_upn") or "unknown"
query = {
"rule_id": str(rule.get("_id")),
"actor": actor,
"timestamp": {"$gte": cutoff},
}
return alerts_collection.count_documents(query, limit=1) > 0
def _create_alert(rule: dict, event: dict):
actor = event.get("actor_display") or event.get("actor_upn") or "unknown"
alert = {
"timestamp": datetime.now(UTC).isoformat(),
"rule_id": str(rule.get("_id")),
@@ -72,10 +108,177 @@ def _create_alert(rule: dict, event: dict):
"severity": rule.get("severity", "medium"),
"event_id": event.get("id"),
"event_dedupe_key": event.get("dedupe_key"),
"actor": actor,
"message": rule.get("message", f"Rule '{rule.get('name')}' triggered"),
"status": "open", # open | acknowledged | resolved | false_positive
}
try:
alerts_collection.insert_one(alert)
logger.info("Alert created", rule=rule.get("name"), event_id=event.get("id"))
except Exception as exc:
logger.warning("Failed to create alert", error=str(exc))
return
# Send notification
if ALERT_WEBHOOK_URL:
try:
from notifications import send_notification
send_notification(
webhook_url=ALERT_WEBHOOK_URL,
format_type=ALERT_WEBHOOK_FORMAT,
rule_name=rule.get("name", "Unnamed rule"),
severity=rule.get("severity", "medium"),
message=rule.get("message", ""),
event=event,
)
except Exception as exc:
logger.warning("Failed to send notification", error=str(exc))
def seed_default_rules():
"""Upsert pre-built admin-ops rule templates. Safe for concurrent startup."""
# One-time cleanup: remove duplicates by name, keep the oldest (_id ascending)
pipeline = [
{"$sort": {"_id": ASCENDING}},
{"$group": {"_id": "$name", "first_id": {"$first": "$_id"}}},
]
seen = {doc["_id"]: doc["first_id"] for doc in rules_collection.aggregate(pipeline)}
for name, keep_id in seen.items():
rules_collection.delete_many({"name": name, "_id": {"$ne": keep_id}})
defaults = [
{
"name": "Failed Conditional Access",
"enabled": True,
"severity": "high",
"message": (
"A Conditional Access policy evaluation failed. "
"This may indicate a sign-in risk or policy misconfiguration."
),
"conditions": [
{"field": "service", "op": "eq", "value": "Directory"},
{"field": "operation", "op": "contains", "value": "ConditionalAccess"},
{"field": "result", "op": "neq", "value": "success"},
],
},
{
"name": "After-Hours Admin Activity",
"enabled": True,
"severity": "medium",
"message": "A privileged operation was performed outside business hours (9 AM to 5 PM).",
"conditions": [
{
"field": "service",
"op": "in",
"value": ["Directory", "UserManagement", "GroupManagement", "RoleManagement"],
},
{"field": "timestamp", "op": "after_hours"},
],
},
{
"name": "New Application Registration",
"enabled": True,
"severity": "medium",
"message": (
"A new application was registered in Entra ID. Review for shadow IT or unauthorized integrations."
),
"conditions": [
{"field": "service", "op": "eq", "value": "ApplicationManagement"},
{"field": "operation", "op": "contains", "value": "Add application"},
],
},
{
"name": "Admin Role Assignment",
"enabled": True,
"severity": "high",
"message": "A user was assigned an administrative role. Verify this was expected and authorized.",
"conditions": [
{"field": "service", "op": "eq", "value": "RoleManagement"},
{"field": "operation", "op": "contains", "value": "Add member to role"},
],
},
{
"name": "License Change",
"enabled": True,
"severity": "low",
"message": "A license was assigned or removed from a user. Monitor for unexpected cost changes.",
"conditions": [
{"field": "service", "op": "eq", "value": "License"},
],
},
{
"name": "Bulk User Deletion",
"enabled": True,
"severity": "high",
"message": (
"Multiple users were deleted in a short window. "
"This may indicate a compromised admin account or cleanup activity."
),
"conditions": [
{"field": "service", "op": "in", "value": ["Directory", "UserManagement"]},
{"field": "operation", "op": "contains", "value": "Delete user"},
],
},
{
"name": "Device Compliance Failure",
"enabled": True,
"severity": "medium",
"message": (
"A device failed compliance evaluation. "
"It may no longer meet your organization's security requirements."
),
"conditions": [
{"field": "service", "op": "eq", "value": "Intune"},
{"field": "operation", "op": "contains", "value": "compliance"},
{"field": "result", "op": "neq", "value": "success"},
],
},
{
"name": "Exchange Transport Rule Change",
"enabled": True,
"severity": "high",
"message": "An Exchange transport rule was modified. This could affect mail flow or security filtering.",
"conditions": [
{"field": "service", "op": "eq", "value": "Exchange"},
{"field": "operation", "op": "contains", "value": "Transport rule"},
],
},
{
"name": "Service Principal Credential Added",
"enabled": True,
"severity": "high",
"message": "A new secret or certificate was added to a service principal. Verify this was expected.",
"conditions": [
{"field": "service", "op": "eq", "value": "ApplicationManagement"},
{"field": "operation", "op": "contains", "value": "Add service principal credentials"},
],
},
{
"name": "External Sharing Enabled",
"enabled": True,
"severity": "medium",
"message": (
"External sharing settings were modified on a SharePoint site or team. Review for data exposure risk."
),
"conditions": [
{"field": "service", "op": "in", "value": ["SharePoint", "Teams"]},
{"field": "operation", "op": "contains", "value": "Sharing"},
],
},
]
inserted = 0
for rule in defaults:
try:
result = rules_collection.replace_one(
{"name": rule["name"]},
rule,
upsert=True,
)
if result.upserted_id:
inserted += 1
except Exception as exc:
logger.warning("Failed to seed rule", rule=rule["name"], error=str(exc))
if inserted:
logger.info("Default admin-ops rules seeded", inserted=inserted, total=len(defaults))

View File

@@ -59,6 +59,7 @@ def test_evaluate_event_creates_alert(monkeypatch):
inserted["doc"] = doc
monkeypatch.setattr(alerts_collection, "insert_one", mock_insert)
monkeypatch.setattr(alerts_collection, "count_documents", lambda *args, **kwargs: 0)
event = {"id": "e1", "operation": "Add user", "timestamp": datetime.now(UTC).isoformat(), "dedupe_key": "dk1"}
triggered = evaluate_event(event)

View File

@@ -3,8 +3,7 @@ services:
image: valkey/valkey:8-alpine
container_name: aoc-redis
restart: always
-ports:
-  - "6379:6379"
+# Ports not exposed to host; backend and worker connect via Docker network
volumes:
- redis_data:/data
@@ -12,8 +11,7 @@ services:
image: mongo:7
container_name: aoc-mongo
restart: always
-ports:
-  - "27017:27017"
+# Ports not exposed to host; backend and worker connect via Docker network
environment:
MONGO_INITDB_ROOT_USERNAME: ${MONGO_ROOT_USERNAME}
MONGO_INITDB_ROOT_PASSWORD: ${MONGO_ROOT_PASSWORD}