AI-native delivery platform

Synco

Full-stack compliance platform — shipped in one afternoon

133
Files shipped
16K
Lines of code
3
Data sources integrated
~4 hrs
Time to production

The Brief

A client needed Belgian company compliance monitoring across three government data sources (KBO, Staatsblad, NBB). The traditional quote: €5K–15K and 2–6 weeks from an agency. We pointed our Synq crew at it. Seven agents worked in parallel — architect designed the domain model, backend built three data clients (SOAP, scraper, REST) and a reconciliation engine, frontend shipped a real-time dashboard, QA validated every endpoint, DevOps containerized the whole thing. Production-ready. One afternoon.

Full-stack SaaS — not a prototype, not a demo. Running in production.

Delivery Manifest

API
FastAPI with 3 async data clients — SOAP (KBO), web scraper (Staatsblad), REST (NBB). A reconciliation engine cross-references the three sources.
Database
PostgreSQL with company, publication, financial filing, and mandate tables. Full migration history.
Frontend
Next.js 15 + React 19 + shadcn/ui. Real-time search, filterable dashboard, responsive.
Export
PDF, Excel, and HTML reports with diff highlighting for compliance changes.
Infra
Docker Compose for local + production. PM2 process management. Automated daily scheduler.
Quality
API tests, type safety end-to-end, self-review loop before every merge.
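The three async data clients can be fanned out concurrently from the orchestrator. A minimal sketch in Python — the fetcher names and return shapes below are illustrative stand-ins, not the shipped Synco code:

```python
import asyncio

# Hypothetical per-source fetchers; the real clients speak SOAP,
# HTML scraping, and REST respectively.
async def fetch_kbo(vat: str) -> dict:
    return {"source": "KBO", "name": "Acme BV"}

async def fetch_staatsblad(vat: str) -> dict:
    return {"source": "Staatsblad", "publications": 3}

async def fetch_nbb(vat: str) -> dict:
    return {"source": "NBB", "filings": 2}

async def sync_company(vat: str) -> list[dict]:
    # Fan out to all three sources in parallel; a failing source
    # comes back as an exception object instead of aborting the sync.
    results = await asyncio.gather(
        fetch_kbo(vat), fetch_staatsblad(vat), fetch_nbb(vat),
        return_exceptions=True,
    )
    return [r for r in results if not isinstance(r, Exception)]

print(asyncio.run(sync_company("BE0123456789")))
```

With `return_exceptions=True`, one slow or broken government API degrades that source rather than the whole sync run.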

System Architecture

Presentation
Next.js 15 SSR
Server components, ISR
shadcn/ui
Data tables, search, filters
API Routes
REST endpoints, Pydantic
Export Downloads
PDF / Excel / HTML
Application
Sync Orchestrator
Fan-out to 3 sources
Query Handlers
Filtered reads, pagination
Export Service
Report generation pipeline
Cache Strategy
Per-source TTL, stale-while-revalidate
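A per-source TTL with stale-while-revalidate fits in a few lines. An illustrative sketch — the TTL values and source keys are assumptions, not the production configuration:

```python
import time

class SourceCache:
    """Per-source TTL cache: fresh hits return immediately; stale hits
    return the old value and mark the source for background refresh."""

    def __init__(self, ttls: dict[str, float]):
        self.ttls = ttls  # seconds of freshness per source
        self.store: dict[str, tuple[float, object]] = {}
        self.needs_refresh: set[str] = set()

    def put(self, source: str, value: object) -> None:
        self.store[source] = (time.monotonic(), value)
        self.needs_refresh.discard(source)

    def get(self, source: str):
        entry = self.store.get(source)
        if entry is None:
            return None
        stored_at, value = entry
        if time.monotonic() - stored_at > self.ttls[source]:
            # Stale-while-revalidate: serve the stale value now,
            # let the next sync pass refresh it.
            self.needs_refresh.add(source)
        return value

cache = SourceCache({"KBO": 86400, "NBB": 86400, "Staatsblad": 3600})
cache.put("KBO", {"name": "Acme BV"})
print(cache.get("KBO"))
```

Serving stale data keeps the dashboard responsive even when a government source is mid-refresh.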
Domain
Company
Aggregate root
Filing · Mandate · Publication
Entities
Reconciliation Rules
Cross-source validation
Compliance Status
Value object
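The domain layer above could be modeled with plain dataclasses. A hedged sketch — the field names and status values are assumptions for illustration, not the shipped schema:

```python
from dataclasses import dataclass, field
from enum import Enum

class ComplianceStatus(Enum):
    # Value object: derived from reconciliation, never mutated in place.
    OK = "ok"
    DISCREPANCY = "discrepancy"

@dataclass(frozen=True)
class Filing:
    year: int
    source: str

@dataclass
class Company:
    # Aggregate root: owns its filings, mandates, and publications.
    vat_number: str
    name: str
    filings: list[Filing] = field(default_factory=list)

    def status(self, sources_agree: bool) -> ComplianceStatus:
        return (ComplianceStatus.OK if sources_agree
                else ComplianceStatus.DISCREPANCY)

acme = Company(vat_number="BE0123456789", name="Acme BV")
print(acme.status(sources_agree=True))  # ComplianceStatus.OK
```

Keeping the domain in plain dataclasses is what lets it stay dependency-free: the SOAP, scraper, and REST adapters import it, never the other way around.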
Infrastructure
PostgreSQL
Persistence, migrations
SOAP / Scraper / REST
3 external adapters
In-Memory Cache
Source-level TTL
PM2 + Docker
Scheduler, containers
Cross-Cutting
Caching
Source-level TTL · Stale-while-revalidate · Cache invalidation on sync
Error Recovery
Per-adapter retry with exponential backoff · Circuit breaker per source
Logging
Structured JSON logs · Request tracing · Sync audit trail
Rate Limiting
Per-source throttle · Respectful scraping · API quota management
Outer layers depend on inner · the domain has zero dependencies
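The per-source retry and circuit-breaker policy can be sketched roughly like this — the thresholds, delays, and class names are illustrative assumptions:

```python
import time

class CircuitOpen(Exception):
    pass

class SourceBreaker:
    """Per-source circuit breaker wrapping retry with exponential backoff."""

    def __init__(self, max_retries: int = 3, failure_threshold: int = 5):
        self.max_retries = max_retries
        self.failure_threshold = failure_threshold
        self.failures = 0

    def call(self, fetch, *args):
        if self.failures >= self.failure_threshold:
            # Too many consecutive failures: stop hammering the source.
            raise CircuitOpen("source disabled until next sync window")
        delay = 0.1
        for attempt in range(self.max_retries):
            try:
                result = fetch(*args)
                self.failures = 0  # success resets the breaker
                return result
            except Exception:
                self.failures += 1
                if attempt == self.max_retries - 1:
                    raise
                time.sleep(delay)
                delay *= 2  # exponential backoff between attempts

breaker = SourceBreaker()
print(breaker.call(lambda: {"source": "NBB", "filings": 2}))
```

One breaker per source means a flaky Staatsblad scrape never blocks the KBO or NBB syncs.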

Data Flow

1. PM2 Scheduler → Sync Orchestrator: trigger daily sync job
2. Orchestrator → Cache: check source-level TTL — skip if fresh
3. Orchestrator → SOAP Adapter: fetch KBO company registry data
4. Orchestrator → Scraper Adapter: scrape Staatsblad publications
5. Orchestrator → REST Adapter: query NBB financial filings
6. Adapters → Reconciliation: cross-reference 3 sources — flag discrepancies
7. Reconciliation → PostgreSQL: upsert normalized records with audit trail
8. Store → Cache: update source TTL — invalidate stale entries
9. Dashboard → Query Handler: user search — cache-first, DB fallback
10. Query Handler → Export Service: generate PDF / Excel / HTML with diffs
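Steps 6–7, cross-referencing the sources and flagging discrepancies before the upsert, might look like the following — the field comparison is an illustrative simplification of the real rules:

```python
def reconcile(records: list[dict]) -> dict:
    """Cross-reference per-source records for one company and flag
    fields where the sources disagree (last source wins the value)."""
    merged: dict = {"discrepancies": []}
    for record in records:
        for key, value in record.items():
            if key == "source":
                continue
            if key in merged and merged[key] != value:
                merged["discrepancies"].append(key)
            merged[key] = value
    return merged

rows = [
    {"source": "KBO", "name": "Acme BV", "status": "active"},
    {"source": "Staatsblad", "name": "Acme BV"},
    {"source": "NBB", "name": "Acme NV", "status": "active"},
]
result = reconcile(rows)
print(result["discrepancies"])  # ['name']
```

The flagged fields are what the export layer renders as diff highlighting in the PDF, Excel, and HTML reports.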

What this would cost elsewhere

Provider · Cost · Timeline
Senior Freelance · €3K–6K · 5–10 days
Dev Agency · €5K–15K · 2–6 weeks
In-house Team · €15K/mo · 2–4 weeks
Synq · 1 afternoon

Stack

FastAPI · Next.js 15 · PostgreSQL · Docker · PM2 · Python · TypeScript

Ship your next product this fast

Tell us what you need built. Our agent crew will scope it, build it, and ship it — with full audit trail.

Join Waitlist

Join the Waitlist

Synq is launching soon. Drop your email and we'll send you an invite when it's ready.

Join early adopters shipping production software with AI