feat: initial KosmoConnect platform v0.1
Includes:
- Backend services: ingestion (:8001), weather API (:8002), gateway (:8003), billing (:8004) with BTCPay integration
- Shared asyncpg pool, TimescaleDB hypertable, Redis, Mosquitto MQTT
- React frontend: Dashboard (MapLibre) and Messaging (chat UI)
- Bridge daemon for Pi + Meshtastic (Serial/TCP T-Deck support)
- Production Docker Compose, Nginx reverse proxy, ops scripts
- DEPLOY.md with step-by-step deployment guide
5
backend/.env.example
Normal file
@@ -0,0 +1,5 @@
# Copy this file to .env and adjust as needed
DATABASE_URL=postgresql://kosmo:kosmo_dev_pass@localhost:5432/kosmoconnect
MQTT_HOST=localhost
MQTT_PORT=1883
MQTT_TOPIC=kosmo/ingest/enviro
18
backend/.env.prod.example
Normal file
@@ -0,0 +1,18 @@
# Copy this file to .env and fill in real values before deploying

# Database
POSTGRES_USER=kosmo
POSTGRES_PASSWORD=CHANGE_ME_TO_STRONG_PASSWORD
POSTGRES_DB=kosmoconnect
DATABASE_URL=postgresql://kosmo:CHANGE_ME_TO_STRONG_PASSWORD@timescaledb:5432/kosmoconnect

# MQTT
MQTT_HOST=mosquitto
MQTT_PORT=1883
MQTT_TOPIC=kosmo/ingest/enviro

# BTCPay Server (Church of Kosmo)
BTCPAY_URL=https://pay.cqre.net
BTCPAY_API_KEY=your_btcpay_api_key
BTCPAY_STORE_ID=your_btcpay_store_id
WEBHOOK_SECRET=your_webhook_secret
21
backend/Dockerfile
Normal file
@@ -0,0 +1,21 @@
# KosmoConnect Backend Services
# Build context: backend/
FROM python:3.13-slim

WORKDIR /app

# Install build dependencies for packages that may need compilation
RUN apt-get update && apt-get install -y --no-install-recommends gcc && rm -rf /var/lib/apt/lists/*

# Copy requirements and install
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy application code
COPY . .

ENV PYTHONPATH=/app:/app/shared
ENV PYTHONUNBUFFERED=1

# Default to API service; override CMD per service
CMD ["uvicorn", "api.src.main:app", "--host", "0.0.0.0", "--port", "8000"]
93
backend/README.md
Normal file
@@ -0,0 +1,93 @@
# KosmoConnect Backend

This directory contains the cloud backend services for KosmoConnect.

## Quick Start

### 1. Start Infrastructure Services

You need Docker running on your machine.

```bash
cd backend
docker-compose up -d
```

This starts:
- **TimescaleDB** on port `5432`
- **Redis** on port `6379`
- **RabbitMQ** on port `5672` (management UI on `15672`)
- **Mosquitto MQTT** on port `1883`

The first time TimescaleDB starts, it will automatically run the migration in `migrations/001_initial_schema.sql`.

### 2. Install Python Dependencies

```bash
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```

### 3. Run the Services

In terminal 1:
```bash
./run-dev.sh ingestion
```

In terminal 2:
```bash
./run-dev.sh api
```

- API docs: http://localhost:8002/docs
- Ingestion health: http://localhost:8001/health

### 4. Simulate Data (No Hardware Needed)

In terminal 3:
```bash
cd ..
python3 -m venv .venv
source .venv/bin/activate
pip install -r backend/requirements.txt
python3 scripts/simulate-bridge.py --interval 5
```

This publishes fake environmental readings every 5 seconds. The ingestion service will pick them up and write them to TimescaleDB. You can then query the API:

```bash
curl "http://localhost:8002/api/v1/weather/latest"
```
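If you want to hand-craft a single reading instead of running the simulator, a minimal sketch is shown below. The field names mirror the columns the API returns; the exact payload shape expected by the ingestion service, the `node-sim-01` id, and the `paho-mqtt` dependency are assumptions for illustration.

```python
import json
import random
import time

MQTT_TOPIC = "kosmo/ingest/enviro"


def build_reading(node_id: str = "node-sim-01") -> dict:
    """One fake reading; field names mirror columns the weather API exposes."""
    return {
        "node_id": node_id,
        "time": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "temperature_c": round(random.uniform(-5.0, 35.0), 1),
        "humidity_percent": round(random.uniform(20.0, 95.0), 1),
        "pressure_pa": round(random.uniform(98000, 103000)),
        "battery_voltage": round(random.uniform(3.3, 4.2), 2),
    }


def publish_reading(host: str = "localhost", port: int = 1883) -> None:
    """Publish a single reading (requires the paho-mqtt package and a running broker)."""
    import paho.mqtt.publish as publish  # deferred import: optional dependency

    publish.single(MQTT_TOPIC, json.dumps(build_reading()), hostname=host, port=port)
```

Calling `publish_reading()` with the dev broker up should produce one row in `enviro_readings` once the ingestion service processes it.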
## Service Architecture

| Service | Port | Responsibility |
|---------|------|----------------|
| API | 8002 | REST API for dashboard and web clients |
| Ingestion | 8001 | Subscribes to MQTT, writes sensor data to TimescaleDB |
| Gateway | 8003 | Web-to-mesh message queue, delivery tracking, and subscription enforcement |
| Billing | 8004 | BTCPay Server integration for subscriptions and invoices |

## Database Schema

- **nodes**: Registry of all enviro-nodes and infrastructure nodes
- **enviro_readings**: Time-series hypertable for sensor data
- **mesh_messages**: Delivery tracking for gateway messages
- **users / subscriptions / allowed_nodes**: Subscriber management
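As a sketch of how a service writes into the `enviro_readings` hypertable: the column subset here is taken from the API's SELECT statements, but the authoritative DDL lives in `migrations/001_initial_schema.sql`, so treat the column types and any NOT NULL constraints as assumptions.

```python
from datetime import datetime, timezone

# Column subset taken from the API's queries; see migrations/001_initial_schema.sql
# for the real schema.
INSERT_READING_SQL = """
    INSERT INTO enviro_readings (time, node_id, temperature_c, humidity_percent)
    VALUES ($1, $2, $3, $4)
"""


async def write_reading(dsn: str, node_id: str, temp_c: float, hum_pct: float) -> None:
    """Write one reading via asyncpg (third-party package, already in requirements.txt)."""
    import asyncpg  # deferred import: only needed when actually writing

    conn = await asyncpg.connect(dsn)
    try:
        await conn.execute(
            INSERT_READING_SQL, datetime.now(timezone.utc), node_id, temp_c, hum_pct
        )
    finally:
        await conn.close()
```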
## Environment Variables
|
||||
|
||||
Copy `.env.example` to `.env` and customize:
|
||||
|
||||
```bash
|
||||
cp .env.example .env
|
||||
```
|
||||
|
||||
| Variable | Default | Description |
|
||||
|----------|---------|-------------|
|
||||
| `DATABASE_URL` | `postgresql://kosmo:kosmo_dev_pass@localhost:5432/kosmoconnect` | TimescaleDB connection |
|
||||
| `MQTT_HOST` | `localhost` | MQTT broker host |
|
||||
| `MQTT_PORT` | `1883` | MQTT broker port |
|
||||
| `MQTT_TOPIC` | `kosmo/ingest/enviro` | Topic for enviro data |
|
||||
144
backend/api/src/main.py
Normal file
@@ -0,0 +1,144 @@
import os
from contextlib import asynccontextmanager
from datetime import datetime, timedelta, timezone
from typing import List, Optional

import asyncpg
from fastapi import FastAPI, Query
from fastapi.middleware.cors import CORSMiddleware

import sys
sys.path.insert(0, os.path.join(os.path.dirname(__file__), "../../shared"))

from db import get_pool
from models import EnviroReading, Node

pool = None


@asynccontextmanager
async def lifespan(app: FastAPI):
    global pool
    pool = await get_pool()
    yield
    await pool.close()


app = FastAPI(title="KosmoConnect API", lifespan=lifespan)
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)


@app.get("/health")
async def health():
    return {"status": "ok", "service": "api"}


@app.get("/api/v1/weather/latest")
async def get_latest_readings(node_id: Optional[str] = Query(None)):
    async with pool.acquire() as conn:
        if node_id:
            rows = await conn.fetch(
                """
                SELECT DISTINCT ON (node_id)
                    time, node_id, temperature_c, humidity_percent, pressure_pa,
                    wind_speed_ms, wind_direction, pm25_ugm3, pm10_ugm3,
                    gas_resistance_kohm, battery_voltage, solar_voltage
                FROM enviro_readings
                WHERE node_id = $1
                ORDER BY node_id, time DESC
                """,
                node_id,
            )
        else:
            rows = await conn.fetch(
                """
                SELECT DISTINCT ON (node_id)
                    time, node_id, temperature_c, humidity_percent, pressure_pa,
                    wind_speed_ms, wind_direction, pm25_ugm3, pm10_ugm3,
                    gas_resistance_kohm, battery_voltage, solar_voltage
                FROM enviro_readings
                ORDER BY node_id, time DESC
                """
            )
    return {"data": [dict(r) for r in rows]}


@app.get("/api/v1/weather/history")
async def get_history(
    node_id: str = Query(...),
    start: datetime = Query(...),
    end: datetime = Query(...),
    interval: Optional[str] = Query("raw", enum=["raw", "1h", "1d"]),
):
    async with pool.acquire() as conn:
        if interval == "raw":
            rows = await conn.fetch(
                """
                SELECT time, node_id, temperature_c, humidity_percent, pressure_pa,
                       wind_speed_ms, wind_direction, pm25_ugm3, pm10_ugm3,
                       gas_resistance_kohm, battery_voltage, solar_voltage
                FROM enviro_readings
                WHERE node_id = $1 AND time >= $2 AND time <= $3
                ORDER BY time DESC
                """,
                node_id,
                start,
                end,
            )
        else:
            # time_bucket() takes an interval, so pass a timedelta that asyncpg
            # can encode as one (a plain string would fail parameter encoding)
            bucket = timedelta(hours=1) if interval == "1h" else timedelta(days=1)
            rows = await conn.fetch(
                """
                SELECT
                    time_bucket($4, time) AS time,
                    node_id,
                    avg(temperature_c) AS temperature_c,
                    avg(humidity_percent) AS humidity_percent,
                    avg(pressure_pa) AS pressure_pa,
                    avg(wind_speed_ms) AS wind_speed_ms,
                    avg(wind_direction)::smallint AS wind_direction,
                    avg(pm25_ugm3) AS pm25_ugm3,
                    avg(pm10_ugm3) AS pm10_ugm3,
                    avg(gas_resistance_kohm) AS gas_resistance_kohm,
                    avg(battery_voltage) AS battery_voltage,
                    avg(solar_voltage) AS solar_voltage
                FROM enviro_readings
                WHERE node_id = $1 AND time >= $2 AND time <= $3
                GROUP BY time_bucket($4, time), node_id
                ORDER BY time DESC
                """,
                node_id,
                start,
                end,
                bucket,
            )
    return {"data": [dict(r) for r in rows]}


@app.get("/api/v1/nodes")
async def get_nodes():
    async with pool.acquire() as conn:
        rows = await conn.fetch(
            """
            SELECT
                id::text,
                mesh_node_id,
                name,
                lat,
                lon,
                hardware_revision,
                installed_at,
                last_seen,
                is_active
            FROM nodes
            WHERE is_active = true
            ORDER BY last_seen DESC NULLS LAST
            """
        )
    return {"data": [dict(r) for r in rows]}
116
backend/billing/README.md
Normal file
@@ -0,0 +1,116 @@
|
||||
# KosmoConnect Billing Service
|
||||
|
||||
Integrates with **BTCPay Server** (`pay.cqre.net`) for subscription payments.
|
||||
|
||||
## What It Does
|
||||
|
||||
- **Invoice Creation**: Generates BTCPay invoices for plan purchases (Wanderer, Guardian, Sanctuary)
|
||||
- **Webhook Handling**: Listens to BTCPay Server webhooks and updates subscription status on payment
|
||||
- **Subscription Activation**: On `InvoiceSettled`, extends the user's active subscription in PostgreSQL
|
||||
- **Invoice History**: Lets users view their past invoices and payment status
|
||||
|
||||
## Why BTCPay Server?
|
||||
|
||||
The Church of Kosmo operates its own payment infrastructure at `pay.cqre.net`. BTCPay Server is a self-hosted, open-source Bitcoin payment processor. It enables sovereign, censorship-resistant payments without relying on third-party card processors.
|
||||
|
||||
## Plan Pricing
|
||||
|
||||
| Plan | Monthly Price | Messages | Scope |
|
||||
|------|---------------|----------|-------|
|
||||
| **Wanderer** | $5.00 | 50/month | Any node on the mesh |
|
||||
| **Guardian** | $12.00 | 500/month | Only whitelisted nodes |
|
||||
| **Sanctuary** | $50.00 | Unlimited | Any node + API/webhooks |
|
||||
|
||||
Prices are denominated in `USD` and paid via BTCPay Server (settled in BTC or Lightning, depending on store configuration).
|
||||
|
||||
## Running Locally
|
||||
|
||||
```bash
|
||||
cd backend
|
||||
export BTCPAY_URL=https://pay.cqre.net
|
||||
export BTCPAY_API_KEY=your_api_key_here
|
||||
export BTCPAY_STORE_ID=your_store_id_here
|
||||
export WEBHOOK_SECRET=your_webhook_secret_here
|
||||
./run-dev.sh billing
|
||||
```
|
||||
|
||||
## BTCPay Server Setup Checklist
|
||||
|
||||
1. **Create an API Key** in your BTCPay Server instance with the following permissions:
|
||||
- `Create invoice`
|
||||
- `View invoices`
|
||||
- `Modify store webhooks`
|
||||
|
||||
2. **Create a Webhook** in your BTCPay store pointing to:
|
||||
```
|
||||
https://your-kosmoconnect-instance/api/v1/billing/webhooks/btcpay
|
||||
```
|
||||
Enable events:
|
||||
- `Invoice created`
|
||||
- `Invoice received payment`
|
||||
- `Invoice processing`
|
||||
- `Invoice expired`
|
||||
- `Invoice settled`
|
||||
- `Invoice invalid`
|
||||
|
||||
3. **Set the Webhook Secret** in the billing service (`WEBHOOK_SECRET`) to verify webhook signatures.
|
||||
|
||||
## API Endpoints
|
||||
|
||||
| Method | Path | Description |
|
||||
|--------|------|-------------|
|
||||
| POST | `/api/v1/billing/invoices` | Create a new invoice for a plan |
|
||||
| GET | `/api/v1/billing/invoices` | List user's invoices |
|
||||
| GET | `/api/v1/billing/invoices/{invoice_id}` | Get invoice details + sync status |
|
||||
| POST | `/api/v1/billing/webhooks/btcpay` | BTCPay webhook receiver |
|
||||
|
||||
## Example: Create an Invoice
|
||||
|
||||
```bash
|
||||
curl -X POST http://localhost:8004/api/v1/billing/invoices \
|
||||
-H "Content-Type: application/json" \
|
||||
-H "X-User-ID: 11111111-1111-1111-1111-111111111111" \
|
||||
-d '{"plan_type": "wanderer", "redirect_url": "https://kosmoconnect.local/thank-you"}'
|
||||
```
|
||||
|
||||
Response:
|
||||
```json
|
||||
{
|
||||
"invoice_id": "...",
|
||||
"checkout_url": "https://pay.cqre.net/i/...",
|
||||
"amount": 5.0,
|
||||
"currency": "USD",
|
||||
"plan_type": "wanderer"
|
||||
}
|
||||
```
|
||||
|
||||
## Webhook Payload
|
||||
|
||||
The billing service expects standard BTCPay Server webhook payloads. On `InvoiceSettled`, it:
|
||||
|
||||
1. Looks up the invoice in `btcpay_invoices`
|
||||
2. Deactivates the user's previous subscription
|
||||
3. Inserts a new active subscription with `valid_from = NOW()` and `valid_until = NOW() + 30 days`
|
||||
4. Resets `messages_used` to `0`
|
||||
|
||||
## Testing Webhooks Locally
|
||||
|
||||
If you can't expose localhost to BTCPay, you can simulate a webhook:
|
||||
|
||||
```bash
|
||||
curl -X POST http://localhost:8004/api/v1/billing/webhooks/btcpay \
|
||||
-H "Content-Type: application/json" \
|
||||
-d '{
|
||||
"type": "InvoiceSettled",
|
||||
"invoiceId": "your-test-invoice-id",
|
||||
"status": "Settled"
|
||||
}'
|
||||
```
|
||||
|
||||
**Note:** Webhook signature verification is skipped if `WEBHOOK_SECRET` is not set.
|
||||
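If you do set `WEBHOOK_SECRET`, a simulated request must also carry a valid `BTCPay-Sig` header. A small helper matching the service's HMAC-SHA256 check (the `sha256=<hex>` header format follows `btcpay_client.verify_webhook`; the secret and invoice id below are placeholders):

```python
import hashlib
import hmac
import json


def btcpay_sig(secret: str, body: bytes) -> str:
    """Value for the BTCPay-Sig header: "sha256=" + hex HMAC-SHA256 of the raw body."""
    return "sha256=" + hmac.new(secret.encode(), body, hashlib.sha256).hexdigest()


body = json.dumps({
    "type": "InvoiceSettled",
    "invoiceId": "your-test-invoice-id",
    "status": "Settled",
}).encode()
print("BTCPay-Sig:", btcpay_sig("your_webhook_secret_here", body))
```

Because the signature covers the exact request bytes, send the same serialized JSON you signed (e.g. with `curl --data-binary`) along with `-H "BTCPay-Sig: sha256=..."`.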
## Troubleshooting

- **"BTCPay not configured"**: Set `BTCPAY_URL`, `BTCPAY_API_KEY`, and `BTCPAY_STORE_ID` environment variables.
- **401 on webhook**: Check that `WEBHOOK_SECRET` matches the secret configured in BTCPay Server.
- **Invoice not found on webhook**: Ensure the invoice was created through the billing service (so the `btcpay_invoice_id` exists in the database).
0
backend/billing/src/__init__.py
Normal file
79
backend/billing/src/btcpay_client.py
Normal file
@@ -0,0 +1,79 @@
import base64
import hashlib
import hmac
import logging
from typing import Optional

import httpx

from . import config

logger = logging.getLogger("billing.btcpay")


class BTCPayClient:
    def __init__(self):
        self.base_url = config.BTCPAY_URL.rstrip("/")
        self.api_key = config.BTCPAY_API_KEY
        self.store_id = config.BTCPAY_STORE_ID
        self.client = httpx.AsyncClient(
            base_url=self.base_url,
            headers={
                "Authorization": f"token {self.api_key}",
                "Content-Type": "application/json",
            },
            timeout=30.0,
        )

    async def create_invoice(
        self,
        amount: float,
        currency: str,
        order_id: str,
        metadata: dict,
        checkout_desc: Optional[str] = None,
    ) -> dict:
        payload = {
            "amount": amount,
            "currency": currency,
            "metadata": {
                "orderId": order_id,
                **metadata,
            },
            "checkout": {
                "redirectURL": metadata.get("redirect_url", ""),
                "redirectAutomatically": True,
            },
        }
        if checkout_desc:
            payload["metadata"]["itemDesc"] = checkout_desc

        url = f"/api/v1/stores/{self.store_id}/invoices"
        resp = await self.client.post(url, json=payload)
        resp.raise_for_status()
        return resp.json()

    async def get_invoice(self, invoice_id: str) -> dict:
        url = f"/api/v1/stores/{self.store_id}/invoices/{invoice_id}"
        resp = await self.client.get(url)
        resp.raise_for_status()
        return resp.json()

    def verify_webhook(self, body: bytes, signature_header: str) -> bool:
        """Verify BTCPay Server webhook signature using HMAC-SHA256."""
        if not config.WEBHOOK_SECRET:
            logger.warning("WEBHOOK_SECRET not set; skipping webhook verification")
            return True

        # BTCPay sends the signature as "sha256=<hex>"
        expected_prefix = "sha256="
        if not signature_header.startswith(expected_prefix):
            return False

        received_sig = signature_header[len(expected_prefix):]
        computed = hmac.new(
            config.WEBHOOK_SECRET.encode(),
            body,
            hashlib.sha256,
        ).hexdigest()
        return hmac.compare_digest(received_sig, computed)
35
backend/billing/src/config.py
Normal file
@@ -0,0 +1,35 @@
import os

BTCPAY_URL = os.getenv("BTCPAY_URL", "https://pay.cqre.net")
BTCPAY_API_KEY = os.getenv("BTCPAY_API_KEY", "")
BTCPAY_STORE_ID = os.getenv("BTCPAY_STORE_ID", "")

# Plan configuration: monthly price in USD (or BTC if you prefer)
PLANS = {
    "wanderer": {
        "name": "Wanderer",
        "price": 5.00,
        "currency": "USD",
        "message_quota": 50,
        "duration_days": 30,
        "scope": "network",
    },
    "guardian": {
        "name": "Guardian",
        "price": 12.00,
        "currency": "USD",
        "message_quota": 500,
        "duration_days": 30,
        "scope": "node",
    },
    "sanctuary": {
        "name": "Sanctuary",
        "price": 50.00,
        "currency": "USD",
        "message_quota": None,
        "duration_days": 30,
        "scope": "network",
    },
}

WEBHOOK_SECRET = os.getenv("WEBHOOK_SECRET", "")
274
backend/billing/src/main.py
Normal file
@@ -0,0 +1,274 @@
#!/usr/bin/env python3
"""
KosmoConnect Billing Service

Integrates with BTCPay Server (pay.cqre.net) for subscription payments.
"""

import json
import logging
import os
import sys
import uuid
from contextlib import asynccontextmanager
from datetime import datetime, timedelta, timezone
from typing import Optional

from fastapi import FastAPI, Header, HTTPException, Request
from fastapi.middleware.cors import CORSMiddleware

sys.path.insert(0, os.path.join(os.path.dirname(__file__), "../../shared"))

from db import get_pool
from billing.src.btcpay_client import BTCPayClient
from billing.src.models import CreateInvoiceRequest, CreateInvoiceResponse
import billing.src.config as config

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("billing")

pool = None
btcpay: Optional[BTCPayClient] = None


# ============================================================
# Subscription Management
# ============================================================
async def activate_subscription(user_id: str, plan_type: str, invoice_id: str):
    plan = config.PLANS.get(plan_type)
    if not plan:
        logger.error("Unknown plan type: %s", plan_type)
        return

    duration = timedelta(days=plan["duration_days"])
    valid_from = datetime.now(timezone.utc)
    valid_until = valid_from + duration

    async with pool.acquire() as conn:
        # Deactivate previous subscriptions for this user
        await conn.execute(
            "UPDATE subscriptions SET is_active = false WHERE user_id = $1",
            user_id,
        )

        await conn.execute(
            """
            INSERT INTO subscriptions (
                id, user_id, plan_type, btcpay_invoice_id, message_quota,
                messages_used, valid_from, valid_until, is_active
            ) VALUES ($1, $2, $3, $4, $5, 0, $6, $7, true)
            """,
            uuid.uuid4(),
            user_id,
            plan_type,
            invoice_id,
            plan["message_quota"],
            valid_from,
            valid_until,
        )
    logger.info("Activated %s subscription for user %s until %s", plan_type, user_id, valid_until)


async def handle_invoice_webhook(invoice_id: str, status: str):
    async with pool.acquire() as conn:
        row = await conn.fetchrow(
            "SELECT * FROM btcpay_invoices WHERE btcpay_invoice_id = $1",
            invoice_id,
        )
    if not row:
        logger.warning("Received webhook for unknown invoice %s", invoice_id)
        return

    db_status = status.capitalize() if status else "Pending"
    settled_at = None
    if status in ("Settled", "Complete"):
        db_status = "Settled"
        settled_at = datetime.now(timezone.utc)
        await activate_subscription(row["user_id"], row["plan_type"], invoice_id)
    elif status == "Expired":
        db_status = "Expired"
    elif status == "Invalid":
        db_status = "Invalid"

    async with pool.acquire() as conn:
        await conn.execute(
            """
            UPDATE btcpay_invoices
            SET status = $1, settled_at = COALESCE($2, settled_at), updated_at = NOW()
            WHERE btcpay_invoice_id = $3
            """,
            db_status,
            settled_at,
            invoice_id,
        )


# ============================================================
# FastAPI App
# ============================================================
@asynccontextmanager
async def lifespan(app: FastAPI):
    global pool, btcpay
    pool = await get_pool()
    btcpay = BTCPayClient()
    yield
    await pool.close()
    await btcpay.client.aclose()


app = FastAPI(title="KosmoConnect Billing Service", lifespan=lifespan)
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)


@app.get("/health")
async def health():
    return {"status": "ok", "service": "billing"}


@app.post("/api/v1/billing/invoices", status_code=201)
async def create_invoice(req: CreateInvoiceRequest, x_user_id: Optional[str] = Header(None)):
    if not x_user_id:
        raise HTTPException(status_code=401, detail="Missing X-User-ID header")

    plan = config.PLANS.get(req.plan_type)
    if not plan:
        raise HTTPException(status_code=400, detail="Invalid plan type")

    if not btcpay.store_id or not btcpay.api_key:
        raise HTTPException(status_code=503, detail="BTCPay not configured")

    order_id = f"kosmo-{x_user_id}-{req.plan_type}-{uuid.uuid4().hex[:8]}"
    try:
        invoice = await btcpay.create_invoice(
            amount=plan["price"],
            currency=plan["currency"],
            order_id=order_id,
            metadata={
                "user_id": x_user_id,
                "plan_type": req.plan_type,
                "redirect_url": req.redirect_url or "",
            },
            checkout_desc=f"KosmoConnect {plan['name']} Plan",
        )
    except Exception as e:
        logger.exception("BTCPay invoice creation failed: %s", e)
        raise HTTPException(status_code=502, detail="Failed to create invoice with BTCPay")

    async with pool.acquire() as conn:
        await conn.execute(
            """
            INSERT INTO btcpay_invoices (
                user_id, btcpay_invoice_id, store_id, plan_type,
                amount, currency, status, checkout_url, metadata
            ) VALUES ($1, $2, $3, $4, $5, $6, 'Pending', $7, $8)
            """,
            x_user_id,
            invoice["id"],
            btcpay.store_id,
            req.plan_type,
            plan["price"],
            plan["currency"],
            invoice.get("checkoutLink") or invoice.get("checkoutUrl", ""),
            json.dumps({"order_id": order_id, "redirect_url": req.redirect_url or ""}),
        )

    return CreateInvoiceResponse(
        invoice_id=invoice["id"],
        checkout_url=invoice.get("checkoutLink") or invoice.get("checkoutUrl", ""),
        amount=plan["price"],
        currency=plan["currency"],
        plan_type=req.plan_type,
    )


@app.get("/api/v1/billing/invoices")
async def list_invoices(x_user_id: Optional[str] = Header(None)):
    if not x_user_id:
        raise HTTPException(status_code=401, detail="Missing X-User-ID header")

    async with pool.acquire() as conn:
        rows = await conn.fetch(
            """
            SELECT
                btcpay_invoice_id AS invoice_id,
                plan_type,
                amount,
                currency,
                status,
                checkout_url,
                created_at,
                settled_at
            FROM btcpay_invoices
            WHERE user_id = $1
            ORDER BY created_at DESC
            """,
            x_user_id,
        )
    return {"data": [dict(r) for r in rows]}


@app.get("/api/v1/billing/invoices/{invoice_id}")
async def get_invoice(invoice_id: str, x_user_id: Optional[str] = Header(None)):
    if not x_user_id:
        raise HTTPException(status_code=401, detail="Missing X-User-ID header")

    async with pool.acquire() as conn:
        row = await conn.fetchrow(
            "SELECT * FROM btcpay_invoices WHERE btcpay_invoice_id = $1 AND user_id = $2",
            invoice_id,
            x_user_id,
        )
    if not row:
        raise HTTPException(status_code=404, detail="Invoice not found")

    # Optionally sync with BTCPay
    try:
        remote = await btcpay.get_invoice(invoice_id)
        row_status = remote.get("status", row["status"])
        if row_status != row["status"]:
            await handle_invoice_webhook(invoice_id, row_status)
    except Exception as e:
        logger.warning("Could not sync invoice %s with BTCPay: %s", invoice_id, e)

    return dict(row)


@app.post("/api/v1/billing/webhooks/btcpay")
async def btcpay_webhook(request: Request):
    body = await request.body()
    signature = request.headers.get("BTCPay-Sig", "")

    if not btcpay.verify_webhook(body, signature):
        logger.warning("BTCPay webhook signature verification failed")
        raise HTTPException(status_code=401, detail="Invalid signature")

    try:
        payload = json.loads(body)
    except json.JSONDecodeError:
        raise HTTPException(status_code=400, detail="Invalid JSON")

    event_type = payload.get("type", "")
    invoice_id = payload.get("invoiceId")
    metadata = payload.get("metadata", {})

    logger.info("BTCPay webhook: type=%s invoice=%s", event_type, invoice_id)

    if event_type.startswith("Invoice") and invoice_id:
        # For detailed events we may still need to query BTCPay for the exact status,
        # but BTCPay v2 webhooks usually include enough info in payload["status"]
        status = payload.get("status", "")
        if not status and event_type == "InvoiceSettled":
            status = "Settled"
        elif not status and event_type == "InvoiceExpired":
            status = "Expired"
        elif not status and event_type == "InvoiceInvalid":
            status = "Invalid"
        await handle_invoice_webhook(invoice_id, status)

    return {"status": "ok"}
28
backend/billing/src/models.py
Normal file
@@ -0,0 +1,28 @@
from typing import Optional
from pydantic import BaseModel


class CreateInvoiceRequest(BaseModel):
    plan_type: str
    redirect_url: Optional[str] = None


class CreateInvoiceResponse(BaseModel):
    invoice_id: str
    checkout_url: str
    amount: float
    currency: str
    plan_type: str


class WebhookPayload(BaseModel):
    # BTCPay webhook payload is flexible; we only validate the parts we need
    deliveryId: Optional[str] = None
    webhookId: Optional[str] = None
    originalDeliveryId: Optional[str] = None
    isRedelivery: bool = False
    type: str
    timestamp: int
    storeId: Optional[str] = None
    invoiceId: Optional[str] = None
    metadata: Optional[dict] = None
129
backend/docker-compose.prod.yml
Normal file
@@ -0,0 +1,129 @@
services:
  timescaledb:
    image: timescale/timescaledb:latest-pg15
    container_name: kosmo_timescaledb
    restart: unless-stopped
    environment:
      POSTGRES_USER: ${POSTGRES_USER:-kosmo}
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
      POSTGRES_DB: ${POSTGRES_DB:-kosmoconnect}
    ports:
      - "127.0.0.1:5432:5432"
    volumes:
      - timescale_data:/var/lib/postgresql/data
      - ./migrations:/docker-entrypoint-initdb.d
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER:-kosmo} -d ${POSTGRES_DB:-kosmoconnect}"]
      interval: 10s
      timeout: 5s
      retries: 5

  redis:
    image: redis:7-alpine
    container_name: kosmo_redis
    restart: unless-stopped
    ports:
      - "127.0.0.1:6379:6379"
    volumes:
      - redis_data:/data

  mosquitto:
    image: eclipse-mosquitto:2
    container_name: kosmo_mosquitto
    restart: unless-stopped
    ports:
      - "127.0.0.1:1883:1883"
    volumes:
      - ./mosquitto.conf:/mosquitto/config/mosquitto.conf
      - mosquitto_data:/mosquitto/data

  api:
    build: .
    container_name: kosmo_api
    restart: unless-stopped
    command: ["uvicorn", "api.src.main:app", "--host", "0.0.0.0", "--port", "8000"]
    environment:
      DATABASE_URL: ${DATABASE_URL}
      MQTT_HOST: ${MQTT_HOST:-mosquitto}
      MQTT_PORT: ${MQTT_PORT:-1883}
    ports:
      - "127.0.0.1:8002:8000"
    depends_on:
      timescaledb:
        condition: service_healthy
      mosquitto:
        condition: service_started

  ingestion:
    build: .
    container_name: kosmo_ingestion
    restart: unless-stopped
    command: ["uvicorn", "ingestion.src.main:app", "--host", "0.0.0.0", "--port", "8000"]
    environment:
      DATABASE_URL: ${DATABASE_URL}
      MQTT_HOST: ${MQTT_HOST:-mosquitto}
      MQTT_PORT: ${MQTT_PORT:-1883}
      MQTT_TOPIC: ${MQTT_TOPIC:-kosmo/ingest/enviro}
    ports:
      - "127.0.0.1:8001:8000"
    depends_on:
      timescaledb:
        condition: service_healthy
      mosquitto:
        condition: service_started

  gateway:
    build: .
    container_name: kosmo_gateway
    restart: unless-stopped
    command: ["uvicorn", "gateway.src.main:app", "--host", "0.0.0.0", "--port", "8000"]
    environment:
      DATABASE_URL: ${DATABASE_URL}
      MQTT_HOST: ${MQTT_HOST:-mosquitto}
      MQTT_PORT: ${MQTT_PORT:-1883}
    ports:
      - "127.0.0.1:8003:8000"
    depends_on:
      timescaledb:
        condition: service_healthy
      mosquitto:
        condition: service_started

  billing:
    build: .
    container_name: kosmo_billing
    restart: unless-stopped
    command: ["uvicorn", "billing.src.main:app", "--host", "0.0.0.0", "--port", "8000"]
    environment:
      DATABASE_URL: ${DATABASE_URL}
      BTCPAY_URL: ${BTCPAY_URL}
      BTCPAY_API_KEY: ${BTCPAY_API_KEY}
      BTCPAY_STORE_ID: ${BTCPAY_STORE_ID}
      WEBHOOK_SECRET: ${WEBHOOK_SECRET}
    ports:
      - "127.0.0.1:8004:8000"
    depends_on:
      timescaledb:
        condition: service_healthy

  nginx:
    image: nginx:alpine
    container_name: kosmo_nginx
    restart: unless-stopped
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ../web/dashboard/dist:/usr/share/nginx/html/dashboard:ro
      - ../web/messaging/dist:/usr/share/nginx/html/messaging:ro
      - ./nginx.conf:/etc/nginx/conf.d/default.conf:ro
      - ./certs:/etc/nginx/certs:ro
    depends_on:
      - api
      - gateway
      - billing

volumes:
  timescale_data:
  redis_data:
  mosquitto_data:
56
backend/docker-compose.yml
Normal file
@@ -0,0 +1,56 @@
services:
  timescaledb:
    image: timescale/timescaledb:latest-pg15
    container_name: kosmo_timescaledb
    environment:
      POSTGRES_USER: kosmo
      POSTGRES_PASSWORD: kosmo_dev_pass
      POSTGRES_DB: kosmoconnect
    ports:
      - "5432:5432"
    volumes:
      - timescale_data:/var/lib/postgresql/data
      - ./migrations:/docker-entrypoint-initdb.d
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U kosmo -d kosmoconnect"]
      interval: 5s
      timeout: 5s
      retries: 5

  redis:
    image: redis:7-alpine
    container_name: kosmo_redis
    ports:
      - "6379:6379"
    volumes:
      - redis_data:/data

  rabbitmq:
    image: rabbitmq:3-management-alpine
    container_name: kosmo_rabbitmq
    ports:
      - "5672:5672"
      - "15672:15672"
    environment:
      RABBITMQ_DEFAULT_USER: kosmo
      RABBITMQ_DEFAULT_PASS: kosmo_dev_pass
    volumes:
      - rabbitmq_data:/var/lib/rabbitmq

  mosquitto:
    image: eclipse-mosquitto:2
    container_name: kosmo_mosquitto
    ports:
      - "1883:1883"
      - "9001:9001"
    volumes:
      - ./mosquitto.conf:/mosquitto/config/mosquitto.conf
      - mosquitto_data:/mosquitto/data
      - mosquitto_logs:/mosquitto/log

volumes:
  timescale_data:
  redis_data:
  rabbitmq_data:
  mosquitto_data:
  mosquitto_logs:
104
backend/gateway/README.md
Normal file
@@ -0,0 +1,104 @@
# KosmoConnect Gateway Service

The **Gateway Service** handles all web-to-mesh and mesh-to-web messaging. It is the monetization boundary of the network.

## What It Does

- **Subscription Enforcement**: Validates that the user has an active subscription and that their plan allows messaging the target node
- **Quota Management**: Tracks monthly message usage and rejects requests when limits are exceeded
- **Outbound Queue**: Accepts web messages, stores them in PostgreSQL, and publishes them to MQTT for bridge delivery
- **Inbound Consumer**: Listens to `kosmo/mesh/inbound` and stores replies, automatically threading them into conversations
- **Delivery Tracking**: Message status progresses `pending -> queued -> transmitted -> delivered` (future: bridge ACKs will update to `transmitted`)
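The outbound-queue step above can be illustrated with the JSON the gateway publishes per message. A minimal sketch: the `outbound_payload` helper is hypothetical, but the field names mirror `mqtt_outbound_worker()` in `src/main.py`.

```python
import json
import uuid
from datetime import datetime, timezone

# Hypothetical helper: builds the JSON the gateway's outbound worker
# publishes to kosmo/mesh/outbound/{node_id} for each pending message.
def outbound_payload(message_id: uuid.UUID, text: str) -> str:
    return json.dumps({
        "message_id": str(message_id),
        "text": text,
        "created_at": datetime.now(timezone.utc).isoformat(),
    })

msg_id = uuid.uuid4()
payload = outbound_payload(msg_id, "Hello mesh")
# Topic layout: MQTT_TOPIC_OUTBOUND_PREFIX + "/" + target node ID
topic = "kosmo/mesh/outbound/" + "!a1b2c3d4"
```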
## Endpoints

| Method | Path | Description |
|--------|------|-------------|
| POST | `/api/v1/messages` | Send a message to a mesh node |
| GET | `/api/v1/messages/conversations` | List all conversations for the user |
| GET | `/api/v1/messages/conversations/{node_id}` | Get full message history with a node |
| GET | `/api/v1/messages/{message_id}` | Check delivery status of a message |

## Authentication (v0.1)

For rapid development, the gateway currently uses a simple `X-User-ID` header to identify the caller. In production this will be replaced with JWT/OAuth2.

## Billing

Subscription management is handled by the [Billing Service](../billing/README.md), which integrates with the Church of Kosmo's BTCPay Server at `pay.cqre.net`. The Gateway does not process payments itself; it only reads subscription state from the shared PostgreSQL database.

## Subscription Scopes

| Plan | Scope | Quota (example) |
|------|-------|-----------------|
| `wanderer` | Any node on the mesh | 50/month |
| `guardian` | Only whitelisted nodes | 500/month |
| `sanctuary` | Any node + API/webhooks | Unlimited |
| `free` | Receive only | 0 outbound |

## Running Locally

Make sure the backend infrastructure (Postgres, MQTT) is running:

```bash
cd backend
docker-compose up -d
```

Seed test users (only needed once):

```bash
docker-compose exec -T timescaledb psql -U kosmo -d kosmoconnect < migrations/002_seed_test_users.sql
```

Start the gateway:

```bash
./run-dev.sh gateway
```

## Testing with cURL

```bash
# Send a message (test wanderer user)
curl -X POST http://localhost:8003/api/v1/messages \
  -H "Content-Type: application/json" \
  -H "X-User-ID: 11111111-1111-1111-1111-111111111111" \
  -d '{"target_node_id": "!a1b2c3d4", "text": "Hello mesh"}'

# List conversations
curl http://localhost:8003/api/v1/messages/conversations \
  -H "X-User-ID: 11111111-1111-1111-1111-111111111111"

# Check message status
curl http://localhost:8003/api/v1/messages/{message_id} \
  -H "X-User-ID: 11111111-1111-1111-1111-111111111111"
```

## Architecture

```
Web Client
    |
    | POST /api/v1/messages (X-User-ID)
    v
Gateway Service (:8003)
    |- Checks subscription + quota in PostgreSQL
    |- Writes message to mesh_messages (status=pending)
    |- Background worker publishes pending rows to MQTT
    |
    v
MQTT Broker (kosmo/mesh/outbound/{node_id})
    |
    v
Bridge Daemon (Pi) -> Meshtastic Mesh -> Target Node

Reply path:
Target Node -> Mesh -> Bridge Daemon -> MQTT (kosmo/mesh/inbound)
    |
    v
Gateway Service consumes MQTT and writes reply to mesh_messages
    |
    v
Web Client reads via GET /api/v1/messages/conversations
```
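The reply path above can be exercised without real radio hardware by publishing a bridge-style payload to `kosmo/mesh/inbound`. A minimal sketch: the `inbound_payload` helper is hypothetical, but the field names mirror what `handle_inbound()` in `src/main.py` reads.

```python
import json

# Hypothetical helper: builds the JSON a bridge daemon would publish on
# kosmo/mesh/inbound to simulate a reply from a mesh node.
def inbound_payload(source_node_id: str, text: str, gateway_node_id=None,
                    hop_count=None, rssi=None, snr=None) -> str:
    return json.dumps({
        "source_node_id": source_node_id,
        "text": text,
        "gateway_node_id": gateway_node_id,
        "hop_count": hop_count,
        "rssi": rssi,
        "snr": snr,
    })

reply = inbound_payload("!a1b2c3d4", "On my way", gateway_node_id="!gateway1",
                        hop_count=2, rssi=-91, snr=7.5)
# To actually publish, paho-mqtt (already in requirements.txt) would do:
#   import paho.mqtt.publish as publish
#   publish.single("kosmo/mesh/inbound", reply, hostname="localhost")
```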
0
backend/gateway/src/__init__.py
Normal file
362
backend/gateway/src/main.py
Normal file
@@ -0,0 +1,362 @@
#!/usr/bin/env python3
"""
KosmoConnect Gateway Service

Handles web-to-mesh messaging:
- Accepts outbound messages from web clients
- Validates subscriptions and quotas
- Publishes to MQTT for bridge delivery
- Consumes inbound mesh messages and stores them as replies
- Tracks delivery status
"""

import asyncio
import json
import logging
import os
import sys
import uuid
from contextlib import asynccontextmanager
from datetime import datetime, timezone
from typing import Optional

import aiomqtt
from fastapi import FastAPI, Header, HTTPException
from fastapi.middleware.cors import CORSMiddleware

sys.path.insert(0, os.path.join(os.path.dirname(__file__), "../../shared"))

from db import get_pool
from gateway.src.models import SendMessageRequest, SendMessageResponse

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("gateway")

MQTT_HOST = os.getenv("MQTT_HOST", "localhost")
MQTT_PORT = int(os.getenv("MQTT_PORT", "1883"))
MQTT_TOPIC_INBOUND = os.getenv("MQTT_TOPIC_INBOUND", "kosmo/mesh/inbound")
MQTT_TOPIC_OUTBOUND_PREFIX = os.getenv("MQTT_TOPIC_OUTBOUND_PREFIX", "kosmo/mesh/outbound")

pool = None


# ============================================================
# Subscription / Quota Enforcement
# ============================================================
async def get_active_subscription(user_id: str):
    async with pool.acquire() as conn:
        row = await conn.fetchrow(
            """
            SELECT id, plan_type, message_quota, messages_used, valid_until
            FROM subscriptions
            WHERE user_id = $1 AND is_active = true
              AND valid_from <= NOW() AND (valid_until IS NULL OR valid_until > NOW())
            ORDER BY valid_from DESC
            LIMIT 1
            """,
            user_id,
        )
    return row


async def can_send_to_node(user_id: str, target_node_id: str) -> bool:
    sub = await get_active_subscription(user_id)
    if not sub:
        return False

    if sub["plan_type"] in ("wanderer", "sanctuary"):
        return True

    if sub["plan_type"] == "guardian":
        async with pool.acquire() as conn:
            row = await conn.fetchrow(
                "SELECT 1 FROM allowed_nodes WHERE user_id = $1 AND mesh_node_id = $2",
                user_id,
                target_node_id,
            )
        return row is not None

    # free plan cannot send outbound
    return False


async def check_and_increment_quota(user_id: str) -> bool:
    sub = await get_active_subscription(user_id)
    if not sub:
        return False
    if sub["message_quota"] is not None and sub["messages_used"] >= sub["message_quota"]:
        return False

    async with pool.acquire() as conn:
        await conn.execute(
            "UPDATE subscriptions SET messages_used = messages_used + 1 WHERE id = $1",
            sub["id"],
        )
    return True
# ============================================================
# MQTT Outbound Worker
# ============================================================
async def mqtt_outbound_worker():
    """Background task: pick up pending messages and publish to MQTT."""
    logger.info("Starting MQTT outbound worker")
    while True:
        try:
            async with aiomqtt.Client(MQTT_HOST, MQTT_PORT) as client:
                while True:
                    async with pool.acquire() as conn:
                        rows = await conn.fetch(
                            """
                            SELECT id, target_node_id, text
                            FROM mesh_messages
                            WHERE direction = 'outbound' AND status = 'pending'
                            ORDER BY created_at ASC
                            LIMIT 50
                            """
                        )

                    for row in rows:
                        topic = f"{MQTT_TOPIC_OUTBOUND_PREFIX}/{row['target_node_id']}"
                        payload = {
                            "message_id": str(row["id"]),
                            "text": row["text"],
                            "created_at": datetime.now(timezone.utc).isoformat(),
                        }
                        try:
                            await client.publish(topic, json.dumps(payload), qos=1)
                            async with pool.acquire() as conn:
                                await conn.execute(
                                    "UPDATE mesh_messages SET status = 'queued', updated_at = NOW() WHERE id = $1",
                                    row["id"],
                                )
                            logger.info("Published pending message %s to %s", row["id"], topic)
                        except Exception as e:
                            logger.exception("Failed to publish message %s: %s", row["id"], e)

                    await asyncio.sleep(2)
        except aiomqtt.MqttError as e:
            logger.error("MQTT outbound worker error: %s. Reconnecting in 5s...", e)
            await asyncio.sleep(5)
# ============================================================
# MQTT Inbound Consumer
# ============================================================
async def mqtt_inbound_consumer():
    """Background task: consume mesh->cloud messages and store replies."""
    logger.info("Starting MQTT inbound consumer")
    while True:
        try:
            async with aiomqtt.Client(MQTT_HOST, MQTT_PORT) as client:
                await client.subscribe(MQTT_TOPIC_INBOUND)
                logger.info("Subscribed to %s", MQTT_TOPIC_INBOUND)
                async for message in client.messages:
                    try:
                        data = json.loads(message.payload.decode())
                        await handle_inbound(data)
                    except Exception as e:
                        logger.exception("Failed to process inbound message: %s", e)
        except aiomqtt.MqttError as e:
            logger.error("MQTT inbound consumer error: %s. Reconnecting in 5s...", e)
            await asyncio.sleep(5)


async def handle_inbound(data: dict):
    source_node_id = data.get("source_node_id")
    text = data.get("text", "")

    async with pool.acquire() as conn:
        # Try to match this inbound message to a user who has previously sent
        # an outbound message to this node. For v0.1 we attach it to the most
        # recent sender, or leave user_id NULL if unknown.
        user_row = await conn.fetchrow(
            """
            SELECT user_id FROM mesh_messages
            WHERE direction = 'outbound' AND target_node_id = $1 AND user_id IS NOT NULL
            ORDER BY created_at DESC
            LIMIT 1
            """,
            source_node_id,
        )
        user_id = user_row["user_id"] if user_row else None

        await conn.execute(
            """
            INSERT INTO mesh_messages (
                id, user_id, direction, sender_node_id, gateway_node_id,
                text, status, hop_count, rssi, snr, created_at, updated_at
            ) VALUES ($1, $2, 'inbound', $3, $4, $5, 'delivered', $6, $7, $8, NOW(), NOW())
            """,
            uuid.UUID(data.get("message_id")) if data.get("message_id") else uuid.uuid4(),
            user_id,
            source_node_id,
            data.get("gateway_node_id"),
            text,
            data.get("hop_count"),
            data.get("rssi"),
            data.get("snr"),
        )
    logger.info("Stored inbound message from %s", source_node_id)
# ============================================================
# FastAPI App
# ============================================================
@asynccontextmanager
async def lifespan(app: FastAPI):
    global pool
    pool = await get_pool()
    t1 = asyncio.create_task(mqtt_outbound_worker())
    t2 = asyncio.create_task(mqtt_inbound_consumer())
    yield
    t1.cancel()
    t2.cancel()
    try:
        await t1
    except asyncio.CancelledError:
        pass
    try:
        await t2
    except asyncio.CancelledError:
        pass
    await pool.close()


app = FastAPI(title="KosmoConnect Gateway Service", lifespan=lifespan)
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)


@app.get("/health")
async def health():
    return {"status": "ok", "service": "gateway"}
@app.post("/api/v1/messages", status_code=202)
async def send_message(req: SendMessageRequest, x_user_id: Optional[str] = Header(None)):
    if not x_user_id:
        raise HTTPException(status_code=401, detail="Missing X-User-ID header")

    if not await can_send_to_node(x_user_id, req.target_node_id):
        raise HTTPException(status_code=403, detail="Subscription does not allow messaging this node")

    if not await check_and_increment_quota(x_user_id):
        raise HTTPException(status_code=429, detail="Monthly message quota exceeded")

    msg_id = uuid.uuid4()
    async with pool.acquire() as conn:
        await conn.execute(
            """
            INSERT INTO mesh_messages (
                id, user_id, direction, target_node_id, text, status, created_at, updated_at
            ) VALUES ($1, $2, 'outbound', $3, $4, 'pending', NOW(), NOW())
            """,
            msg_id,
            x_user_id,
            req.target_node_id,
            req.text,
        )

    logger.info("Queued message %s from user %s to %s", msg_id, x_user_id, req.target_node_id)
    return SendMessageResponse(message_id=str(msg_id), status="pending", queued_at=datetime.now(timezone.utc))
@app.get("/api/v1/messages/conversations")
async def list_conversations(x_user_id: Optional[str] = Header(None)):
    if not x_user_id:
        raise HTTPException(status_code=401, detail="Missing X-User-ID header")

    async with pool.acquire() as conn:
        rows = await conn.fetch(
            """
            WITH user_msgs AS (
                SELECT
                    CASE
                        WHEN direction = 'outbound' THEN target_node_id
                        ELSE sender_node_id
                    END AS node_id,
                    text,
                    created_at,
                    direction,
                    ROW_NUMBER() OVER (PARTITION BY
                        CASE
                            WHEN direction = 'outbound' THEN target_node_id
                            ELSE sender_node_id
                        END
                        ORDER BY created_at DESC
                    ) AS rn,
                    -- Count inbound messages over the whole conversation, not
                    -- just the single latest row kept by rn = 1 below.
                    -- (v0.1 has no read receipts, so this is "total inbound".)
                    COUNT(*) FILTER (WHERE direction = 'inbound') OVER (PARTITION BY
                        CASE
                            WHEN direction = 'outbound' THEN target_node_id
                            ELSE sender_node_id
                        END
                    ) AS inbound_count
                FROM mesh_messages
                WHERE user_id = $1
            )
            SELECT
                m.node_id,
                m.text AS latest_text,
                m.created_at AS latest_at,
                a.nickname,
                m.inbound_count::int AS unread_count
            FROM user_msgs m
            LEFT JOIN allowed_nodes a ON a.user_id = $1 AND a.mesh_node_id = m.node_id
            WHERE m.rn = 1
            ORDER BY m.created_at DESC
            """,
            x_user_id,
        )

    return {"data": [dict(r) for r in rows]}
@app.get("/api/v1/messages/conversations/{node_id}")
async def get_conversation(node_id: str, x_user_id: Optional[str] = Header(None)):
    if not x_user_id:
        raise HTTPException(status_code=401, detail="Missing X-User-ID header")

    async with pool.acquire() as conn:
        rows = await conn.fetch(
            """
            SELECT
                id::text,
                direction,
                sender_node_id,
                target_node_id,
                text,
                status,
                hop_count,
                rssi,
                snr,
                created_at
            FROM mesh_messages
            WHERE user_id = $1 AND (
                (direction = 'outbound' AND target_node_id = $2)
                OR
                (direction = 'inbound' AND sender_node_id = $2)
            )
            ORDER BY created_at ASC
            """,
            x_user_id,
            node_id,
        )

    return {"data": [dict(r) for r in rows]}
@app.get("/api/v1/messages/{message_id}")
async def get_message_status(message_id: str, x_user_id: Optional[str] = Header(None)):
    if not x_user_id:
        raise HTTPException(status_code=401, detail="Missing X-User-ID header")

    async with pool.acquire() as conn:
        row = await conn.fetchrow(
            "SELECT id::text, status, created_at, updated_at FROM mesh_messages WHERE id = $1 AND user_id = $2",
            message_id,
            x_user_id,
        )
    if not row:
        raise HTTPException(status_code=404, detail="Message not found")
    return dict(row)
35
backend/gateway/src/models.py
Normal file
@@ -0,0 +1,35 @@
from datetime import datetime
from typing import Optional
from pydantic import BaseModel, Field


class SendMessageRequest(BaseModel):
    target_node_id: str
    text: str = Field(..., max_length=200)


class SendMessageResponse(BaseModel):
    message_id: str
    status: str
    queued_at: datetime


class MessageOut(BaseModel):
    id: str
    direction: str
    sender_node_id: Optional[str] = None
    target_node_id: Optional[str] = None
    text: Optional[str] = None
    status: Optional[str] = None
    hop_count: Optional[int] = None
    rssi: Optional[int] = None
    snr: Optional[float] = None
    created_at: datetime


class ConversationSummary(BaseModel):
    node_id: str
    nickname: Optional[str] = None
    latest_text: Optional[str] = None
    latest_at: Optional[datetime] = None
    unread_count: int = 0
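A quick sanity check of the request model's validation, sketched with a self-contained copy of `SendMessageRequest` (duplicated here only so the snippet runs standalone) showing how the 200-character limit is enforced at the API boundary:

```python
from pydantic import BaseModel, Field, ValidationError

# Self-contained copy of the gateway's SendMessageRequest model.
class SendMessageRequest(BaseModel):
    target_node_id: str
    text: str = Field(..., max_length=200)

# A message within the limit validates cleanly.
ok = SendMessageRequest(target_node_id="!a1b2c3d4", text="Hello mesh")

# Anything over 200 characters is rejected before it reaches the mesh.
too_long = False
try:
    SendMessageRequest(target_node_id="!a1b2c3d4", text="x" * 201)
except ValidationError:
    too_long = True
```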
116
backend/ingestion/src/main.py
Normal file
@@ -0,0 +1,116 @@
import asyncio
import json
import logging
import os
import sys
from contextlib import asynccontextmanager
from datetime import datetime, timezone

import aiomqtt
from fastapi import FastAPI

sys.path.insert(0, os.path.join(os.path.dirname(__file__), "../../shared"))

from db import get_pool
from models import IngestEnviroPayload, EnviroReading

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("ingestion")

MQTT_HOST = os.getenv("MQTT_HOST", "localhost")
MQTT_PORT = int(os.getenv("MQTT_PORT", "1883"))
MQTT_TOPIC = os.getenv("MQTT_TOPIC", "kosmo/ingest/enviro")

pool = None


async def handle_enviro(payload: IngestEnviroPayload):
    global pool
    if pool is None:
        return

    reading = payload.payload
    async with pool.acquire() as conn:
        # Ensure node exists (upsert minimal record)
        await conn.execute(
            """
            INSERT INTO nodes (mesh_node_id, name, lat, lon, last_seen, is_active)
            VALUES ($1, $2, $3, $4, $5, true)
            ON CONFLICT (mesh_node_id) DO UPDATE
            SET last_seen = EXCLUDED.last_seen,
                lat = COALESCE(EXCLUDED.lat, nodes.lat),
                lon = COALESCE(EXCLUDED.lon, nodes.lon)
            """,
            payload.node_id,
            payload.node_id,
            payload.lat,
            payload.lon,
            payload.received_at,
        )

        # Insert reading
        await conn.execute(
            """
            INSERT INTO enviro_readings (
                time, node_id, temperature_c, humidity_percent, pressure_pa,
                wind_speed_ms, wind_direction, pm25_ugm3, pm10_ugm3,
                gas_resistance_kohm, battery_voltage, solar_voltage
            ) VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12)
            """,
            reading.time or payload.received_at,
            payload.node_id,
            reading.temperature_c,
            reading.humidity_percent,
            reading.pressure_pa,
            reading.wind_speed_ms,
            reading.wind_direction,
            reading.pm25_ugm3,
            reading.pm10_ugm3,
            reading.gas_resistance_kohm,
            reading.battery_voltage,
            reading.solar_voltage,
        )
    logger.info("Ingested reading for node %s", payload.node_id)


async def mqtt_consumer():
    global pool
    pool = await get_pool()
    logger.info("Connecting to MQTT at %s:%s", MQTT_HOST, MQTT_PORT)

    while True:
        try:
            async with aiomqtt.Client(MQTT_HOST, MQTT_PORT) as client:
                await client.subscribe(MQTT_TOPIC)
                logger.info("Subscribed to %s", MQTT_TOPIC)
                async for message in client.messages:
                    try:
                        data = json.loads(message.payload.decode())
                        payload = IngestEnviroPayload(**data)
                        await handle_enviro(payload)
                    except Exception as e:
                        logger.exception("Failed to process message: %s", e)
        except aiomqtt.MqttError as e:
            logger.error("MQTT error: %s. Reconnecting in 5s...", e)
            await asyncio.sleep(5)


@asynccontextmanager
async def lifespan(app: FastAPI):
    task = asyncio.create_task(mqtt_consumer())
    yield
    task.cancel()
    try:
        await task
    except asyncio.CancelledError:
        pass
    if pool:
        await pool.close()


app = FastAPI(title="KosmoConnect Ingestion Service", lifespan=lifespan)


@app.get("/health")
async def health():
    return {"status": "ok", "service": "ingestion"}
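The ingestion flow above can be smoke-tested by publishing a synthetic reading to `kosmo/ingest/enviro`. A minimal sketch: the envelope shape (`node_id`/`lat`/`lon`/`received_at` plus a nested `payload` of readings) is inferred from how `handle_enviro()` uses `IngestEnviroPayload`, so treat the exact field names as assumptions.

```python
import json
from datetime import datetime, timezone

# Hypothetical simulator helper: builds a JSON envelope shaped like the
# messages handle_enviro() consumes from the ingest topic.
def enviro_message(node_id: str, lat: float, lon: float, **readings) -> str:
    return json.dumps({
        "node_id": node_id,
        "lat": lat,
        "lon": lon,
        "received_at": datetime.now(timezone.utc).isoformat(),
        "payload": readings,
    })

msg = enviro_message("!test0001", 52.52, 13.405,
                     temperature_c=21.4, humidity_percent=48.0,
                     battery_voltage=3.97)
# Publish with paho-mqtt (listed under "Simulator script" in requirements.txt):
#   import paho.mqtt.publish as publish
#   publish.single("kosmo/ingest/enviro", msg, hostname="localhost")
```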
102
backend/migrations/001_initial_schema.sql
Normal file
@@ -0,0 +1,102 @@
-- KosmoConnect Initial Schema
-- Runs automatically when the TimescaleDB container starts for the first time

-- Enable TimescaleDB extension
CREATE EXTENSION IF NOT EXISTS timescaledb;

-- ============================================================
-- Nodes Registry
-- ============================================================
CREATE TABLE IF NOT EXISTS nodes (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    mesh_node_id TEXT UNIQUE NOT NULL,
    name TEXT,
    lat DOUBLE PRECISION,
    lon DOUBLE PRECISION,
    hardware_revision TEXT DEFAULT 'v1.0',
    installed_at TIMESTAMPTZ DEFAULT NOW(),
    last_seen TIMESTAMPTZ,
    is_active BOOLEAN DEFAULT true,
    metadata JSONB DEFAULT '{}'
);

CREATE INDEX IF NOT EXISTS idx_nodes_mesh_node_id ON nodes(mesh_node_id);
CREATE INDEX IF NOT EXISTS idx_nodes_last_seen ON nodes(last_seen);

-- ============================================================
-- Environmental Readings (Time-series)
-- ============================================================
CREATE TABLE IF NOT EXISTS enviro_readings (
    time TIMESTAMPTZ NOT NULL,
    node_id TEXT NOT NULL REFERENCES nodes(mesh_node_id) ON DELETE CASCADE,
    temperature_c DOUBLE PRECISION,
    humidity_percent DOUBLE PRECISION,
    pressure_pa DOUBLE PRECISION,
    wind_speed_ms DOUBLE PRECISION,
    wind_direction SMALLINT,
    pm25_ugm3 DOUBLE PRECISION,
    pm10_ugm3 DOUBLE PRECISION,
    gas_resistance_kohm DOUBLE PRECISION,
    battery_voltage DOUBLE PRECISION,
    solar_voltage DOUBLE PRECISION
);

-- Convert to hypertable for automatic time-based partitioning
SELECT create_hypertable('enviro_readings', 'time', if_not_exists => TRUE);

CREATE INDEX IF NOT EXISTS idx_enviro_node_id_time ON enviro_readings(node_id, time DESC);

-- ============================================================
-- Users & Subscriptions (minimal schema for Phase 1/2)
-- Defined before mesh_messages so its foreign key on users(id) resolves.
-- ============================================================
CREATE TABLE IF NOT EXISTS users (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    email TEXT UNIQUE NOT NULL,
    stripe_customer_id TEXT,
    created_at TIMESTAMPTZ DEFAULT NOW()
);

CREATE TABLE IF NOT EXISTS subscriptions (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    user_id UUID NOT NULL REFERENCES users(id) ON DELETE CASCADE,
    plan_type TEXT NOT NULL CHECK (plan_type IN ('free', 'wanderer', 'guardian', 'sanctuary')),
    stripe_subscription_id TEXT,
    message_quota INTEGER,
    messages_used INTEGER DEFAULT 0,
    valid_from TIMESTAMPTZ DEFAULT NOW(),
    valid_until TIMESTAMPTZ,
    is_active BOOLEAN DEFAULT true
);

CREATE INDEX IF NOT EXISTS idx_subscriptions_user ON subscriptions(user_id);

CREATE TABLE IF NOT EXISTS allowed_nodes (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    user_id UUID NOT NULL REFERENCES users(id) ON DELETE CASCADE,
    mesh_node_id TEXT NOT NULL,
    nickname TEXT,
    created_at TIMESTAMPTZ DEFAULT NOW(),
    UNIQUE(user_id, mesh_node_id)
);

-- ============================================================
-- Mesh Messages (for gateway delivery tracking)
-- ============================================================
CREATE TABLE IF NOT EXISTS mesh_messages (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    user_id UUID REFERENCES users(id) ON DELETE SET NULL,
    direction TEXT NOT NULL CHECK (direction IN ('inbound', 'outbound')),
    sender_node_id TEXT,
    target_node_id TEXT,
    gateway_node_id TEXT,
    text TEXT,
    status TEXT DEFAULT 'pending' CHECK (status IN ('pending', 'queued', 'transmitted', 'delivered', 'failed')),
    hop_count INTEGER,
    rssi INTEGER,
    snr DOUBLE PRECISION,
    created_at TIMESTAMPTZ DEFAULT NOW(),
    updated_at TIMESTAMPTZ DEFAULT NOW()
);

CREATE INDEX IF NOT EXISTS idx_mesh_messages_status ON mesh_messages(status);
CREATE INDEX IF NOT EXISTS idx_mesh_messages_target ON mesh_messages(target_node_id, created_at DESC);
20
backend/migrations/002_seed_test_users.sql
Normal file
@@ -0,0 +1,20 @@
-- Seed test users and subscriptions for local gateway development
-- Run manually when setting up a fresh dev environment

INSERT INTO users (id, email)
VALUES
    ('11111111-1111-1111-1111-111111111111', 'test@kosmoconnect.local'),
    ('22222222-2222-2222-2222-222222222222', 'guardian@kosmoconnect.local')
ON CONFLICT (id) DO NOTHING;

INSERT INTO subscriptions (id, user_id, plan_type, message_quota, valid_from, valid_until, is_active)
VALUES
    ('aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa', '11111111-1111-1111-1111-111111111111', 'wanderer', 50, NOW(), NOW() + INTERVAL '1 year', true),
    ('bbbbbbbb-bbbb-bbbb-bbbb-bbbbbbbbbbbb', '22222222-2222-2222-2222-222222222222', 'guardian', 500, NOW(), NOW() + INTERVAL '1 year', true)
ON CONFLICT DO NOTHING;

INSERT INTO allowed_nodes (id, user_id, mesh_node_id, nickname)
VALUES
    ('cccccccc-cccc-cccc-cccc-cccccccccccc', '22222222-2222-2222-2222-222222222222', '!a1b2c3d4', 'Base Camp'),
    ('dddddddd-dddd-dddd-dddd-dddddddddddd', '22222222-2222-2222-2222-222222222222', '!b2c3d4e5', 'Lookout')
ON CONFLICT DO NOTHING;
24
backend/migrations/003_btcpay_billing.sql
Normal file
@@ -0,0 +1,24 @@
-- BTCPay Server Billing Schema

CREATE TABLE IF NOT EXISTS btcpay_invoices (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    user_id UUID NOT NULL REFERENCES users(id) ON DELETE CASCADE,
    btcpay_invoice_id TEXT UNIQUE,
    store_id TEXT NOT NULL,
    plan_type TEXT NOT NULL CHECK (plan_type IN ('free', 'wanderer', 'guardian', 'sanctuary')),
    amount DECIMAL(16, 8) NOT NULL,
    currency TEXT NOT NULL DEFAULT 'USD',
    status TEXT DEFAULT 'Pending' CHECK (status IN ('Pending', 'Processing', 'Settled', 'Invalid', 'Expired')),
    checkout_url TEXT,
    created_at TIMESTAMPTZ DEFAULT NOW(),
    settled_at TIMESTAMPTZ,
    metadata JSONB DEFAULT '{}'
);

CREATE INDEX IF NOT EXISTS idx_btcpay_invoices_user ON btcpay_invoices(user_id);
CREATE INDEX IF NOT EXISTS idx_btcpay_invoices_btcpay_id ON btcpay_invoices(btcpay_invoice_id);

-- Add btcpay metadata to subscriptions for traceability
ALTER TABLE subscriptions
    ADD COLUMN IF NOT EXISTS btcpay_invoice_id TEXT,
    ADD COLUMN IF NOT EXISTS payment_method TEXT;
5
backend/mosquitto.conf
Normal file
@@ -0,0 +1,5 @@
listener 1883
allow_anonymous true
persistence true
persistence_location /mosquitto/data/
log_dest file /mosquitto/log/mosquitto.log
49
backend/nginx.conf
Normal file
@@ -0,0 +1,49 @@
server {
    listen 80;
    server_name _;

    # Dashboard
    location / {
        root /usr/share/nginx/html/dashboard;
        try_files $uri $uri/ /index.html;
    }

    # Messaging client
    location /messaging {
        alias /usr/share/nginx/html/messaging;
        try_files $uri $uri/ /messaging/index.html;
    }

    # API services proxy
    location /api/v1/weather/ {
        proxy_pass http://api:8000/api/v1/weather/;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }

    location /api/v1/nodes {
        proxy_pass http://api:8000/api/v1/nodes;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }

    location /api/v1/messages {
        proxy_pass http://gateway:8000/api/v1/messages;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }

    location /api/v1/billing/ {
        proxy_pass http://billing:8000/api/v1/billing/;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
20
backend/requirements.txt
Normal file
20
backend/requirements.txt
Normal file
@@ -0,0 +1,20 @@
# KosmoConnect Backend Dependencies
fastapi==0.111.0
uvicorn[standard]==0.30.0
pydantic==2.11.3
pydantic-settings==2.8.1

# Database
asyncpg==0.30.0

# MQTT
aiomqtt==2.3.0

# HTTP client (for inter-service calls, webhooks)
httpx==0.27.0

# Utilities
python-dateutil==2.9.0

# Simulator script
paho-mqtt==2.1.0
33
backend/run-dev.sh
Executable file
33
backend/run-dev.sh
Executable file
@@ -0,0 +1,33 @@
#!/usr/bin/env bash
set -euo pipefail

# KosmoConnect Backend Dev Runner
# Usage: ./run-dev.sh [ingestion|api|gateway|billing]

SERVICE="${1:-}"
if [ -z "$SERVICE" ]; then
    echo "Usage: ./run-dev.sh [ingestion|api|gateway|billing]"
    exit 1
fi

cd "$(dirname "$0")"

export PYTHONPATH="${PYTHONPATH:-}:$(pwd)/shared"

if [ "$SERVICE" = "ingestion" ]; then
    echo "Starting Ingestion Service..."
    uvicorn ingestion.src.main:app --host 0.0.0.0 --port 8001 --reload
elif [ "$SERVICE" = "api" ]; then
    echo "Starting API Service..."
    uvicorn api.src.main:app --host 0.0.0.0 --port 8002 --reload
elif [ "$SERVICE" = "gateway" ]; then
    echo "Starting Gateway Service..."
    uvicorn gateway.src.main:app --host 0.0.0.0 --port 8003 --reload
elif [ "$SERVICE" = "billing" ]; then
    echo "Starting Billing Service..."
    uvicorn billing.src.main:app --host 0.0.0.0 --port 8004 --reload
else
    echo "Unknown service: $SERVICE"
    echo "Usage: ./run-dev.sh [ingestion|api|gateway|billing]"
    exit 1
fi
0
backend/shared/__init__.py
Normal file
0
backend/shared/__init__.py
Normal file
11
backend/shared/db.py
Normal file
11
backend/shared/db.py
Normal file
@@ -0,0 +1,11 @@
import os

import asyncpg

DB_DSN = os.getenv(
    "DATABASE_URL",
    "postgresql://kosmo:kosmo_dev_pass@localhost:5432/kosmoconnect"
)


async def get_pool() -> asyncpg.Pool:
    """Create a new connection pool; call once per service at startup and share it."""
    return await asyncpg.create_pool(DB_DSN, min_size=2, max_size=10)
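Because `get_pool()` builds a fresh pool on every call, each service should await it once and reuse the result. A stdlib-only sketch (hypothetical helper, not part of this commit) of caching an async factory so repeated awaits share one instance:

```python
import asyncio


def once(factory):
    """Wrap an async factory so it runs once; later calls reuse the result."""
    cache = {}
    lock = asyncio.Lock()

    async def wrapper():
        async with lock:  # guard against concurrent first calls
            if "value" not in cache:
                cache["value"] = await factory()
        return cache["value"]

    return wrapper

# Usage sketch: get_shared_pool = once(get_pool)
# then `await get_shared_pool()` returns the same Pool everywhere.
```

In a FastAPI service the same effect is usually achieved by creating the pool in a startup/lifespan hook and storing it on the app state.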
46
backend/shared/models.py
Normal file
46
backend/shared/models.py
Normal file
@@ -0,0 +1,46 @@
|
||||
from datetime import datetime
|
||||
from typing import Optional
|
||||
from pydantic import BaseModel, Field
|
||||
|
||||
|
||||
class EnviroReading(BaseModel):
|
||||
time: datetime
|
||||
node_id: str
|
||||
temperature_c: Optional[float] = None
|
||||
humidity_percent: Optional[float] = None
|
||||
pressure_pa: Optional[float] = None
|
||||
wind_speed_ms: Optional[float] = None
|
||||
wind_direction: Optional[int] = None
|
||||
pm25_ugm3: Optional[float] = None
|
||||
pm10_ugm3: Optional[float] = None
|
||||
gas_resistance_kohm: Optional[float] = None
|
||||
battery_voltage: Optional[float] = None
|
||||
solar_voltage: Optional[float] = None
|
||||
|
||||
class Config:
|
||||
from_attributes = True
|
||||
|
||||
|
||||
class Node(BaseModel):
|
||||
id: Optional[str] = None
|
||||
mesh_node_id: str
|
||||
name: Optional[str] = None
|
||||
lat: Optional[float] = None
|
||||
lon: Optional[float] = None
|
||||
hardware_revision: str = "v1.0"
|
||||
installed_at: Optional[datetime] = None
|
||||
last_seen: Optional[datetime] = None
|
||||
is_active: bool = True
|
||||
|
||||
class Config:
|
||||
from_attributes = True
|
||||
|
||||
|
||||
class IngestEnviroPayload(BaseModel):
|
||||
type: str = Field(default="enviro_reading")
|
||||
node_id: str
|
||||
received_at: datetime
|
||||
hop_count: Optional[int] = None
|
||||
lat: Optional[float] = None
|
||||
lon: Optional[float] = None
|
||||
payload: EnviroReading
|
||||
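The optional fields let nodes report partial sensor sets, and Pydantic coerces ISO timestamps on validation. A self-contained sketch of validating a raw MQTT dict (a minimal stand-in model is redefined here so the snippet runs without `shared` on the path):

```python
from datetime import datetime
from typing import Optional

from pydantic import BaseModel


# Minimal stand-in for shared.models.EnviroReading (subset of fields).
class EnviroReading(BaseModel):
    time: datetime
    node_id: str
    temperature_c: Optional[float] = None
    humidity_percent: Optional[float] = None


raw = {
    "time": "2025-01-01T12:00:00Z",
    "node_id": "!a1b2c3d4",
    "temperature_c": 21.5,
}
# model_validate parses the ISO string into a tz-aware datetime and
# leaves unreported sensors as None.
reading = EnviroReading.model_validate(raw)
```

Invalid payloads raise `pydantic.ValidationError`, which the ingestion service can catch and log instead of writing bad rows.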