# Admin Operations Center (AOC)

FastAPI microservice that ingests Microsoft Entra (Azure AD) and other admin audit logs into MongoDB, dedupes them, and exposes a UI/API to fetch, search, and review events.
## Components
- FastAPI app under `backend/` with routes to fetch audit logs and list stored events.
- MongoDB for persistence (provisioned via Docker Compose).
- Microsoft Graph client (client credentials) for retrieving directory audit events and Intune audit events.
- Office 365 Management Activity API client for Exchange/SharePoint/Teams admin audit logs.
- Frontend served from the backend for filtering/searching events and viewing raw entries.
- Optional OIDC bearer auth (Entra) to protect the API/UI and gate access by roles/groups.
## Prerequisites (macOS)
- Python 3.11
- Docker Desktop (for the quickest start) or a local MongoDB instance
- An Entra app registration with the `AuditLog.Read.All` Application permission and admin consent granted.
- Also required to fetch other sources:
  - Office 365 Management Activity API: the `https://manage.office.com/.default` scope with `ActivityFeed.Read`/`ActivityFeed.ReadDlp` (added to the app registration's API permissions for Office 365 Management APIs).
  - Intune audit: `DeviceManagementConfiguration.Read.All` (or broader) for `/deviceManagement/auditEvents`.
- Optional API protection: configure `AUTH_ENABLED=true` and set `AUTH_TENANT_ID`/`AUTH_CLIENT_ID` (the audience) plus allowed roles/groups.
## Configuration
Create a `.env` file at the repo root (copy `.env.example`) and fill in your Microsoft Graph app credentials. The provided `MONGO_URI` works with the bundled MongoDB container; change it if you use a different Mongo instance.
```sh
cp .env.example .env
# edit .env to add TENANT_ID, CLIENT_ID, CLIENT_SECRET (and MONGO_URI if needed)

# optional: enable auth & periodic fetch
# AUTH_ENABLED=true
# AUTH_TENANT_ID=...
# AUTH_CLIENT_ID=...
# AUTH_ALLOWED_ROLES=Admins,SecurityOps
# ENABLE_PERIODIC_FETCH=true
# FETCH_INTERVAL_MINUTES=60

# optional: data retention (auto-expire old events via MongoDB TTL)
# RETENTION_DAYS=90

# optional: CORS origins if the frontend is served separately
# CORS_ORIGINS=http://localhost:3000,https://app.example.com
```
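For scripts outside the service, the same variables can be read with a small settings helper. This is an illustrative sketch only (the repo's actual configuration lives in `backend/config.py`); the variable names follow this README, everything else is hypothetical:

```python
import os
from dataclasses import dataclass, field


@dataclass
class Settings:
    """Reads AOC-related settings from the environment (illustrative sketch)."""

    mongo_uri: str = field(
        default_factory=lambda: os.getenv("MONGO_URI", "mongodb://root:example@localhost:27017")
    )
    auth_enabled: bool = field(
        default_factory=lambda: os.getenv("AUTH_ENABLED", "false").lower() == "true"
    )
    # comma-separated list, e.g. AUTH_ALLOWED_ROLES=Admins,SecurityOps
    allowed_roles: list = field(
        default_factory=lambda: [
            r.strip() for r in os.getenv("AUTH_ALLOWED_ROLES", "").split(",") if r.strip()
        ]
    )
    # 0 means "no TTL configured"
    retention_days: int = field(default_factory=lambda: int(os.getenv("RETENTION_DAYS", "0")))
```

Booleans and lists arrive as strings from the environment, so the helper normalizes them once at startup rather than at every use site.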
## Run with Docker Compose (recommended)
```sh
docker compose up --build
```
- API: http://localhost:8000
- Frontend: http://localhost:8000
- Health: http://localhost:8000/health
- Mongo: localhost:27017 (root/example)
## Run locally without Docker
- Start MongoDB (e.g. with Docker):

  ```sh
  docker run --rm -p 27017:27017 -e MONGO_INITDB_ROOT_USERNAME=root -e MONGO_INITDB_ROOT_PASSWORD=example mongo:7
  ```

- Prepare the backend environment and start the API:

  ```sh
  cd backend
  python3 -m venv .venv
  source .venv/bin/activate
  pip install -r requirements.txt
  export $(cat ../.env | xargs)  # or set env vars manually
  uvicorn main:app --reload --host 0.0.0.0 --port 8000
  ```
## API
- `GET /health` — health check with MongoDB connectivity status.
- `GET /metrics` — Prometheus metrics for request latency, fetch volume, and errors.
- `GET /api/fetch-audit-logs` — pulls the last 7 days by default (override with `?hours=N`, capped to 30 days) of:
  - Entra directory audit logs (`/auditLogs/directoryAudits`)
  - Exchange/SharePoint/Teams admin audits (via the Office 365 Management Activity API)
  - Intune audit logs (`/deviceManagement/auditEvents`)

  Dedupes on a stable key (the source id, or timestamp/category/operation/target). Returns a count and per-source warnings.
  - Incremental fetch: each source remembers its last successful fetch time in MongoDB (the `watermarks` collection); subsequent calls fetch only events newer than the watermark.
  - Alerting: if `ALERTS_ENABLED=true`, events are evaluated against stored rules during ingestion.
  - SIEM export: if `SIEM_ENABLED=true`, each ingested event is forwarded to `SIEM_WEBHOOK_URL`.
- `GET /api/events` — list stored events with filters: `service`, `actor`, `operation`, `result`, `start`, `end`, `search` (free text over raw/summary/actor/targets).
  - Pagination: cursor-based (`page_size` defaults to 50, max 500). Pass `cursor` from `next_cursor` to paginate forward.
- `GET /api/filter-options` — best-effort distinct values for services, operations, results, and actors (used by UI dropdowns).
- `POST /api/webhooks/graph` — receives Microsoft Graph change notifications. Echoes `validationToken` when present.
- `GET /api/source-health` — last fetch status for each ingestion source (`directory`, `unified`, `intune`).
- `PATCH /api/events/{id}/tags` — update tags on an event (e.g., `investigating`, `false_positive`).
- `POST /api/events/{id}/comments` — add a comment to an event.
- `GET /api/rules` — list alert rules.
- `POST /api/rules` — create an alert rule.
- `PUT /api/rules/{id}` — update an alert rule.
- `DELETE /api/rules/{id}` — delete an alert rule.
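The cursor pagination described above can be modeled as follows. This is an illustrative sketch, not the service's actual implementation: it assumes the cursor is an opaque base64 encoding of the last returned (timestamp, id) pair, with an in-memory list standing in for the MongoDB query.

```python
import base64
import json


def paginate(events, cursor=None, page_size=50):
    """Return one page of events (newest first) plus an opaque next_cursor.

    `events` is a list of dicts with "timestamp" and "id" keys.
    Sketch only -- the real service pages a MongoDB query the same way.
    """
    page_size = min(page_size, 500)  # documented maximum
    ordered = sorted(events, key=lambda e: (e["timestamp"], e["id"]), reverse=True)
    if cursor:
        # Resume strictly after the last (timestamp, id) the client saw.
        last = json.loads(base64.urlsafe_b64decode(cursor))
        ordered = [
            e for e in ordered
            if (e["timestamp"], e["id"]) < (last["timestamp"], last["id"])
        ]
    page = ordered[:page_size]
    next_cursor = None
    if len(ordered) > page_size and page:
        tail = page[-1]
        next_cursor = base64.urlsafe_b64encode(
            json.dumps({"timestamp": tail["timestamp"], "id": tail["id"]}).encode()
        ).decode()
    return page, next_cursor
```

Encoding the sort position (rather than a numeric offset) keeps pages stable while new events are being ingested, which is why cursor pagination is the usual choice for append-heavy collections.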
Stored document shape (collection `micro_soc.events`):
```jsonc
{
  "id": "...",               // original source id
  "timestamp": "...",        // activityDateTime
  "service": "...",          // category
  "operation": "...",        // activityDisplayName
  "result": "...",
  "actor_display": "...",    // resolved user/app name
  "target_displays": [ ... ],
  "display_summary": "...",
  "dedupe_key": "...",       // used for upserts
  "actor": { ... },          // initiatedBy
  "targets": [ ... ],        // targetResources
  "raw": { ... },            // full source event
  "raw_text": "..."          // raw as string for text search
}
```
## Development
### Linting and formatting
We use `ruff` for linting and formatting.
```sh
cd backend
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt -r requirements-dev.txt
ruff check .
ruff format .
```
### Running tests
```sh
cd backend
pytest -q
```
### Quick smoke tests
With the server running:
```sh
curl http://localhost:8000/health
curl http://localhost:8000/api/events
curl http://localhost:8000/api/fetch-audit-logs
```
- Visit the UI at http://localhost:8000 to filter by user/service/action/result/time, search raw text, paginate, and view raw events.
## Maintenance (Dockerized)
Use the backend image so you don’t need a local venv:
```sh
# ensure Mongo + backend network are up
docker compose up -d mongo

# re-run enrichment/normalization on stored events (uses .env for Graph/Mongo)
docker compose run --rm backend python maintenance.py renormalize --limit 500

# deduplicate existing events (optional)
docker compose run --rm backend python maintenance.py dedupe
```
Omit `--limit` to process all events. You can also run commands inside a running backend container with `docker compose exec backend ...`.
## Notes / Troubleshooting
- Ensure `TENANT_ID`, `CLIENT_ID`, and `CLIENT_SECRET` match an app registration with the `AuditLog.Read.All` (application) permission and admin consent.
- Additional permissions: Office 365 Management Activity (`ActivityFeed.Read`) and Intune audit (`DeviceManagementConfiguration.Read.All`).
- Auth: if `AUTH_ENABLED=true`, tokens must be issued by `AUTH_TENANT_ID` with audience `AUTH_CLIENT_ID`; access is granted if the token's roles or groups overlap `AUTH_ALLOWED_ROLES`/`AUTH_ALLOWED_GROUPS` (if set).
- Backfill limits: the Management Activity API typically exposes ~7 days of history (longer if your tenant has extended/Advanced Audit retention). Directory/Intune audit retention follows your tenant policy (commonly 30–90 days, longer with Advanced Audit).
- If you change Mongo credentials/ports, update `MONGO_URI` in `.env` (Docker Compose passes it through to the backend).
- The service uses the `micro_soc` database and `events` collection by default; adjust in `backend/config.py` if needed.
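The role/group gate described in the auth note amounts to a simple overlap check. A hypothetical helper, assuming the token has already been validated and its `roles`/`groups` claims extracted (the real check lives in the backend's auth layer):

```python
def is_authorized(token_claims: dict, allowed_roles: set, allowed_groups: set) -> bool:
    """Grant access if the token's roles/groups overlap the configured allow-lists.

    Sketch of the documented "if set" behaviour: when neither allow-list is
    configured, any valid token from the right tenant/audience is accepted.
    """
    if not allowed_roles and not allowed_groups:
        return True
    roles = set(token_claims.get("roles", []))
    groups = set(token_claims.get("groups", []))
    return bool(roles & allowed_roles) or bool(groups & allowed_groups)
```

Note that large Entra group memberships may be delivered as a "groups overage" claim instead of an inline list, in which case the groups must be resolved via Graph before a check like this applies.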