T-SecOps runs a fully local AI stack — no cloud dependency, no SaaS lock-in. Every model, every detection algorithm, and every data pipeline runs in your environment. Here's exactly how it works under the hood.
From raw packet to AI-generated alert — the full flow from network capture to analyst dashboard, with latency targets at each stage.
T-SecOps runs as a Docker Compose stack. Services are grouped into data collection, AI processing, and user interface layers — each independently scalable and replaceable.
- suricata — EVE-JSON output, custom rules, Sigma rule engine, pfBlockerNG integration
- timescaledb — time-series hypertables for alert logs and metrics with auto-compression
- postgres — relational data, agent configs, compliance evidence store, pgvector extension
- redis — Celery broker, task queues, short-lived alert buffers, session state
- backend — REST + WebSocket API, Pydantic v2 schemas, async SQLAlchemy ORM
- worker — 5 autonomous background jobs, queue-based task dispatch, retry logic
- ollama — local LLM runtime, 6 custom Modelfiles, GPU/CPU inference, context streaming
- ml-engine — 5 trained models loaded at startup, inference <15ms per event
- frontend — TypeScript, Tailwind CSS, Recharts, WebSocket live feed
- firewall-connector — read-only rule sync, pfBlockerNG log ingestion, UniFi client tracking
- ti-worker — AbuseIPDB, OTX, GreyNoise, Shodan, Feodo, Spamhaus, URLhaus
- xdr-agent — Windows + Linux, Go binary, eBPF on Linux, WMI on Windows

All models are trained on real network telemetry and run entirely on-premises. No external inference API, no data leaving your environment.
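As a rough illustration, the layering above might be wired together like this abbreviated compose file. This is a sketch, not the shipped configuration: service names follow the stack's own naming, but the image tags, build contexts, ports, and volumes shown here are assumptions.

```yaml
# Illustrative excerpt only -- images, ports, and volumes are assumed,
# not taken from the actual T-SecOps distribution.
services:
  timescaledb:
    image: timescale/timescaledb:latest-pg16
    volumes: ["tsdb-data:/var/lib/postgresql/data"]
  redis:
    image: redis:7-alpine
  ollama:
    image: ollama/ollama:latest
    volumes: ["ollama-models:/root/.ollama"]
  backend:
    build: ./backend
    depends_on: [timescaledb, redis, ollama]
    ports: ["8000:8000"]

volumes:
  tsdb-data:
  ollama-models:
```

Because each layer is its own service, scaling (or swapping) one layer is a compose-level change rather than a code change.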
Each model is a system-prompted specialization of qwen2.5:7b (or nomic-embed-text for RAG). They run fully locally via the Ollama runtime — no API keys, no internet required.
```
# Modelfile: t-secops-analyst
FROM qwen2.5:7b
PARAMETER temperature 0.1
PARAMETER num_ctx 8192
SYSTEM """
You are a senior SOC analyst for T-SecOps. You analyze network alerts
and produce structured threat assessments with MITRE ATT&CK mappings
and actionable recommendations.
Response format: severity, summary, indicators, actions.
"""
```
```
# Modelfile: t-secops-classifier
FROM qwen2.5:7b
PARAMETER temperature 0.0
PARAMETER num_ctx 4096
SYSTEM """
You are a classification engine. Given a Suricata alert, output ONLY valid JSON:
{"severity":"critical|high|medium|low",
 "category":"c2|lateral|exfil|recon|exploit|other",
 "confidence":0.0-1.0,
 "mitre":"TXXXX.XXX"}
No prose. No explanation.
"""
```
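Because the classifier is pinned to temperature 0.0 and instructed to emit JSON only, its output can be validated mechanically before it enters the pipeline. A minimal sketch of that check (the field names come from the system prompt above; the `validate_classification` helper itself is hypothetical, not part of T-SecOps):

```python
import json

VALID_SEVERITIES = {"critical", "high", "medium", "low"}
VALID_CATEGORIES = {"c2", "lateral", "exfil", "recon", "exploit", "other"}

def validate_classification(raw: str) -> dict:
    """Parse classifier output and enforce the documented JSON contract."""
    result = json.loads(raw)  # raises ValueError if the model leaked prose
    assert result["severity"] in VALID_SEVERITIES
    assert result["category"] in VALID_CATEGORIES
    assert 0.0 <= result["confidence"] <= 1.0
    assert result["mitre"].startswith("T")
    return result

# A well-formed classifier response passes cleanly:
out = validate_classification(
    '{"severity":"high","category":"c2","confidence":0.91,"mitre":"T1071.001"}'
)
print(out["severity"])  # high
```

Rejecting anything that fails this gate keeps a rare malformed completion from poisoning downstream correlation.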
```
# Modelfile: t-secops-correlator
FROM qwen2.5:7b
PARAMETER temperature 0.2
PARAMETER num_ctx 16384
SYSTEM """
You correlate multiple security alerts into unified threat campaigns.
Map observed indicators to the MITRE ATT&CK kill chain.
Identify source IP, affected assets, and estimated attack stage.
Output structured JSON only.
"""
```
```
# Modelfile: t-secops-briefer
FROM qwen2.5:7b
PARAMETER temperature 0.3
PARAMETER num_ctx 12288
SYSTEM """
You write concise daily security briefings for SMB security teams.
Given 24h alert summaries, produce: executive summary (3 sentences),
top 5 events with context, recommended actions ranked by priority,
and overnight threat trend. Plain English.
"""
```
```
# Modelfile: t-secops-compliance
FROM qwen2.5:7b
PARAMETER temperature 0.1
PARAMETER num_ctx 8192
SYSTEM """
You are a compliance specialist for NIS2 Directive, NIST CSF 2.0,
and CIS Controls v8.1. Analyse security posture gaps, map to
framework controls, and produce remediation steps ordered by risk
impact. Output JSON with control IDs.
"""
```
```
# Modelfile: t-secops-rag
FROM nomic-embed-text
# No system prompt — pure embedding model
# Output: 768-dimensional dense vectors
# Stored in: pgvector (PostgreSQL extension)
# Used by: t-secops-analyst RAG lookups
# Sync interval: every 30 minutes
```
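The RAG lookup reduces to nearest-neighbor search over those 768-dimensional vectors. pgvector does this in SQL (its `<=>` operator computes cosine distance), but the underlying math is plain cosine similarity. A toy sketch, with 3-dimensional vectors and made-up document names standing in for the real embeddings:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity: the metric behind pgvector's <=> cosine distance."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy corpus: in T-SecOps these would be 768-dim nomic-embed-text vectors
# keyed by knowledge-base documents; names here are illustrative.
docs = {
    "c2-beaconing-runbook":  [0.9, 0.1, 0.0],
    "phishing-triage-guide": [0.1, 0.9, 0.2],
}
query = [0.8, 0.2, 0.1]  # embedding of the analyst's current alert context

best = max(docs, key=lambda name: cosine_similarity(query, docs[name]))
print(best)  # c2-beaconing-runbook
```

In production the same ranking happens inside PostgreSQL (`ORDER BY embedding <=> $1 LIMIT k`), so the vectors never leave the database.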
Step-by-step data flow with latency targets at each stage and the services responsible.
T-SecOps is designed to handle sensitive network telemetry. These are the architectural decisions that govern how that data is stored, transported, and protected.
The FastAPI backend exposes a full REST API with OpenAPI docs at /docs. WebSocket endpoints power real-time alert streaming. All endpoints accept JWT bearer tokens or API keys.
```bash
# Analyze alert ID 12345
curl -X POST https://t-secops.local/api/v1/ai/analyze \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"alert_id": 12345, "model": "analyst"}'

# Response
{
  "severity": "high",
  "summary": "Suspected C2 beaconing to 185.220.101.x",
  "mitre": "T1071.001",
  "confidence": 0.87,
  "actions": ["block_ip", "isolate_host"]
}
```
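The same call from Python might look like the sketch below. The endpoint and payload match the curl example; `parse_assessment` is a hypothetical helper showing how the documented response fields could be consumed, and the live request is shown commented out because it needs a running stack and a valid token.

```python
import json

def parse_assessment(body: str) -> tuple[str, list[str]]:
    """Extract severity and recommended actions from an /ai/analyze response."""
    data = json.loads(body)
    return data["severity"], data["actions"]

# The documented response from the curl example above:
sample = """
{"severity": "high",
 "summary": "Suspected C2 beaconing to 185.220.101.x",
 "mitre": "T1071.001",
 "confidence": 0.87,
 "actions": ["block_ip", "isolate_host"]}
"""
severity, actions = parse_assessment(sample)
print(severity, actions)  # high ['block_ip', 'isolate_host']

# To hit the live endpoint (requires a reachable stack and a bearer token):
# import requests
# r = requests.post(
#     "https://t-secops.local/api/v1/ai/analyze",
#     headers={"Authorization": f"Bearer {token}"},
#     json={"alert_id": 12345, "model": "analyst"},
# )
# severity, actions = parse_assessment(r.text)
```

From here, `actions` can feed directly into automation (e.g. a firewall-connector block rule), since the values are a fixed vocabulary rather than free text.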
Docker Compose deployment on Ubuntu, ARM64, or NVIDIA GPU. Full step-by-step guide with hardware requirements, configuration, and first-run verification.