# CQRS Catalog (.NET 10, Kafka, PostgreSQL)

A production-style CQRS Catalog service in .NET 10 using PostgreSQL, Kafka, the Outbox pattern, and a projection worker for scalable read/write separation. This starter for a Catalog domain provides:
- Write side API and domain layers
- Read side API optimized for queries
- Projection worker for event-driven read model updates
- Kafka as event backbone
- PostgreSQL with separate write/read databases
## Table Of Contents
- Quick Start (2-minute)
- Overview
- Architecture
- Architecture Decisions
- Repository Tabs
- Tech Stack Summary
- NuGet Libraries Summary
- Getting Started
- Run The Services
- Verify Local Setup
- CI Pipeline
- GitOps With Argo CD (Next Step)
- Current Status
- Roadmap
- Contributing
- Branch Strategy
- Commit Convention
## Quick Start (2-minute)
```bash
# 1) start infra
docker compose -f infra/docker-compose.yml up -d

# 2) run write api
dotnet run --project src/Catalog.Write.Api

# 3) run read api (new terminal)
dotnet run --project src/Catalog.Read.Api

# 4) run projection worker (new terminal)
dotnet run --project src/Catalog.Projection.Worker
```

Open:

- Kafka UI: http://localhost:8080
- Write API: http://localhost:5127/weatherforecast
- Read API: http://localhost:5266/weatherforecast
## Overview
This repository follows CQRS principles:
- Commands and transactional consistency live on the write side
- Queries and denormalized models live on the read side
- Domain changes are propagated asynchronously through events
- Reliability patterns are prepared using Outbox and Idempotency tables
## Architecture

### Project Architecture Diagram

Place the architecture image at `docs/images/cqrs-architecture.png` and it will render on GitHub:

![CQRS architecture diagram](docs/images/cqrs-architecture.png)
### Mermaid Reference Diagram

```mermaid
flowchart LR
    A["Admin UI"] --> B["Catalog.Write.Api (.NET 10)"]
    B --> C[("Write DB - Postgres")]
    B --> D[("Outbox Table")]
    D --> E["Outbox Publisher (Hosted Service)"]
    E --> F[("Kafka Topic: catalog.events")]
    F --> G["Projection Worker (.NET 10)"]
    H["Web/Mobile"] --> I["Catalog.Read.Api (.NET 10)"]
    G --> J[("Read DB - Postgres")]
    I --> J
```
## Architecture Decisions

| Decision | Why It Exists | Outcome |
|---|---|---|
| Separate Write and Read Databases | Command and query workloads have different access patterns | Independent optimization and clearer boundaries |
| Outbox Table (`outbox_messages`) | Prevent lost events between DB commit and broker publish | Reliable event delivery with retry support |
| Kafka Topic (`catalog.events`) | Decouple producers from consumers | Async scaling and independent evolution of projections |
| Projection Worker + Idempotency (`processed_events`) | Consumers may receive duplicates or retries | Exactly-once effect on read model updates |
| Read Model Denormalization (`catalog_product_read`) | Query APIs need fast, simple lookups | Lower latency and simpler query handlers |
| CQRS Layered Projects | Keep API, domain, app, and infrastructure concerns separated | Better maintainability and testability |
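The reliability decisions above hinge on two small tables. As a rough sketch only — the real definitions live in `infra/sql`, and every column name below is illustrative rather than copied from this repo:

```sql
-- Illustrative schemas; see infra/sql for the actual DDL.
CREATE TABLE outbox_messages (
    id           uuid PRIMARY KEY,
    event_type   text        NOT NULL,
    payload      jsonb       NOT NULL,
    occurred_at  timestamptz NOT NULL DEFAULT now(),
    processed_at timestamptz             -- NULL until the publisher hands it to Kafka
);

CREATE TABLE processed_events (
    event_id     uuid PRIMARY KEY,       -- the worker skips ids it has already seen
    processed_at timestamptz NOT NULL DEFAULT now()
);
```

The write side inserts the domain change and the outbox row in the same transaction, so an event can never be lost between commit and publish; the worker records each consumed event id in `processed_events` alongside the read-model upsert, so redelivered messages become no-ops.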
## Repository Tabs
Use this as a quick navigation map when browsing the GitHub repo.
| Tab | Purpose | Path |
|---|---|---|
| Write API | Command endpoints (create/update flows) | src/Catalog.Write.Api |
| Write Application | Use cases, command handlers, validation | src/Catalog.Write.Application |
| Write Domain | Domain model and business rules | src/Catalog.Write.Domain |
| Write Infrastructure | Persistence/integration for write side | src/Catalog.Write.Infrastructure |
| Read API | Query endpoints | src/Catalog.Read.Api |
| Read Infrastructure | Query persistence/read adapters | src/Catalog.Read.Infrastructure |
| Projection Worker | Kafka consumer + projection updater | src/Catalog.Projection.Worker |
| Shared | Cross-cutting contracts/utilities | src/Catalog.Shared |
| Infra | Docker Compose + SQL bootstrap scripts | infra/ |
| Docs | Learning notes and phase documentation | docs/ |
## Tech Stack Summary

| Area | Technology | Version | Source |
|---|---|---|---|
| Framework | .NET | net10.0 | `src/Catalog.Write.Api/Catalog.Write.Api.csproj` |
| Web API | ASP.NET Core | 10.0.3 | `src/Catalog.Write.Api/Catalog.Write.Api.csproj` |
| API Docs | OpenAPI (Microsoft.AspNetCore.OpenApi) | 10.0.3 | `src/Catalog.Read.Api/Catalog.Read.Api.csproj` |
| API Docs UI | Swagger (Swashbuckle.AspNetCore) | 10.1.4 | `src/Catalog.Write.Api/Catalog.Write.Api.csproj` |
| Messaging | Apache Kafka (Confluent image) | 7.6.1 | `infra/docker-compose.yml` |
| Kafka Coordination | Zookeeper (Confluent image) | 7.6.1 | `infra/docker-compose.yml` |
| Kafka UI | Provectus Kafka UI | latest | `infra/docker-compose.yml` |
| Database | PostgreSQL | 16 | `infra/docker-compose.yml` |
| ORM / Data Access | EF Core | 10.0.3 | `src/Catalog.Write.Infrastructure/Catalog.Write.Infrastructure.csproj` |
| ORM Provider | Npgsql EF Core Provider | 10.0.0 | `src/Catalog.Write.Infrastructure/Catalog.Write.Infrastructure.csproj` |
| SQL Mapper | Dapper | 2.1.66 | `src/Catalog.Read.Infrastructure/Catalog.Read.Infrastructure.csproj` |
| PostgreSQL Driver | Npgsql | 10.0.1 | `src/Catalog.Read.Infrastructure/Catalog.Read.Infrastructure.csproj` |
| CQRS Handler Pipeline | MediatR | 14.0.0 | `src/Catalog.Write.Application/Catalog.Write.Application.csproj` |
| Validation | FluentValidation | 12.1.1 | `src/Catalog.Write.Application/Catalog.Write.Application.csproj` |
| Worker Hosting | Microsoft.Extensions.Hosting | 10.0.3 | `src/Catalog.Projection.Worker/Catalog.Projection.Worker.csproj` |
| Kafka Client | Confluent.Kafka | 2.13.0 | `src/Catalog.Projection.Worker/Catalog.Projection.Worker.csproj` |
## NuGet Libraries Summary

| Package | Version | Used In |
|---|---|---|
| Confluent.Kafka | 2.13.0 | `src/Catalog.Projection.Worker/Catalog.Projection.Worker.csproj` |
| Dapper | 2.1.66 | `src/Catalog.Read.Infrastructure/Catalog.Read.Infrastructure.csproj`, `src/Catalog.Write.Infrastructure/Catalog.Write.Infrastructure.csproj`, `src/Catalog.Projection.Worker/Catalog.Projection.Worker.csproj` |
| FluentValidation | 12.1.1 | `src/Catalog.Write.Application/Catalog.Write.Application.csproj` |
| MediatR | 14.0.0 | `src/Catalog.Write.Application/Catalog.Write.Application.csproj`, `src/Catalog.Write.Api/Catalog.Write.Api.csproj` |
| Microsoft.AspNetCore.OpenApi | 10.0.3 | `src/Catalog.Read.Api/Catalog.Read.Api.csproj`, `src/Catalog.Write.Api/Catalog.Write.Api.csproj` |
| Microsoft.EntityFrameworkCore | 10.0.3 | `src/Catalog.Write.Infrastructure/Catalog.Write.Infrastructure.csproj` |
| Microsoft.EntityFrameworkCore.Design | 10.0.3 | `src/Catalog.Write.Infrastructure/Catalog.Write.Infrastructure.csproj` |
| Microsoft.Extensions.Hosting | 10.0.3 | `src/Catalog.Projection.Worker/Catalog.Projection.Worker.csproj` |
| Npgsql | 10.0.1 | `src/Catalog.Read.Infrastructure/Catalog.Read.Infrastructure.csproj`, `src/Catalog.Projection.Worker/Catalog.Projection.Worker.csproj` |
| Npgsql.EntityFrameworkCore.PostgreSQL | 10.0.0 | `src/Catalog.Write.Infrastructure/Catalog.Write.Infrastructure.csproj` |
| Swashbuckle.AspNetCore | 10.1.4 | `src/Catalog.Write.Api/Catalog.Write.Api.csproj` |
## Getting Started

### Prerequisites

- .NET 10 SDK
- Docker Desktop
- Git

Check versions:

```bash
dotnet --version
docker --version
git --version
```

### 1) Start Infrastructure

From the repo root:

```bash
docker compose -f infra/docker-compose.yml up -d
```

This starts:

- `cqrs_postgres` on `localhost:5432`
- `cqrs_zookeeper` on `localhost:2181`
- `cqrs_kafka` on `localhost:9092`
- `cqrs_kafka_ui` on `localhost:8080`
### 2) Create/Verify Kafka Topic

Topic auto-creation is enabled for local development, but you can create the topic explicitly:

```bash
docker exec -it cqrs_kafka kafka-topics \
  --bootstrap-server localhost:9092 \
  --create \
  --topic catalog.events \
  --partitions 3 \
  --replication-factor 1
```

List topics:

```bash
docker exec -it cqrs_kafka kafka-topics \
  --bootstrap-server localhost:9092 \
  --list
```

### 3) Database Bootstrap

The Postgres initialization scripts in `infra/sql` run automatically on first container initialization:

- `infra/sql/00-create-dbs.sql`
- `infra/sql/10-write-schema.sql`
- `infra/sql/20-read-schema.sql`
If the volume already exists, remove it and start again to re-run the scripts:

```bash
docker compose -f infra/docker-compose.yml down -v
docker compose -f infra/docker-compose.yml up -d
```

## Run The Services

Open separate terminals from the repo root.

### Write API

```bash
dotnet run --project src/Catalog.Write.Api
```

Default local URL (from launch settings): http://localhost:5127

### Read API

```bash
dotnet run --project src/Catalog.Read.Api
```

Default local URL: http://localhost:5266

### Projection Worker

```bash
dotnet run --project src/Catalog.Projection.Worker
```

## Verify Local Setup
### Kafka UI

Open: http://localhost:8080

Expected:

- Cluster: `local`
- Topic: `catalog.events`

### PostgreSQL

List databases:

```bash
docker exec -it cqrs_postgres psql -U postgres -c "\l"
```

List write-side tables:

```bash
docker exec -it cqrs_postgres psql -U postgres -d catalog_write -c "\dt"
```

List read-side tables:

```bash
docker exec -it cqrs_postgres psql -U postgres -d catalog_read -c "\dt"
```

## CI Pipeline
GitHub Actions workflows are added under `.github/workflows`:

- `ci.yml`: restore, build, code-style check (`dotnet format`), tests (if test projects exist), Docker image publish, and Trivy image scanning.
- `codeql.yml`: CodeQL static analysis for C# on push, PR, and a weekly schedule.
- `dependency-review.yml`: dependency vulnerability/license risk checks on pull requests.
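For orientation only, the build/verify stage of such a pipeline typically has the shape below. This is a condensed sketch, not the repo's actual workflow — `.github/workflows/ci.yml` is the source of truth, and the trigger branches and step order here are assumptions:

```yaml
# Illustrative sketch; the real workflow is .github/workflows/ci.yml.
name: ci
on:
  push:
    branches: [main, master]
  pull_request:

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-dotnet@v4
        with:
          dotnet-version: 10.0.x
      - run: dotnet restore
      - run: dotnet format --verify-no-changes   # code-style gate
      - run: dotnet build --no-restore
      - run: dotnet test --no-build              # runs only if test projects exist
```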
### Docker Hub Secrets Required

Set these repository secrets in GitHub:

- `DOCKERHUB_USERNAME`
- `DOCKERHUB_TOKEN`
### Docker Publish + Scan Strategy (inside CI)

CI publishes all services into a single Docker Hub repository:

`docker.io/<DOCKERHUB_USERNAME>/cqrs`

Service-specific tags distinguish images within that repository. For each service (`write-api`, `read-api`, `projection-worker`), CI:

- Builds and pushes the image.
- Runs a Trivy image scan (`HIGH`, `CRITICAL`) and uploads SARIF results to GitHub Security.
- Fails the pipeline if the scan finds high/critical vulnerabilities.
### Tagging Strategy

On a `main` branch push:

- `<service>-latest`
- `<service>-main`
- `<service>-sha-<commit>`

On a `master` branch push:

- `<service>-latest`
- `<service>-master`
- `<service>-sha-<commit>`

On a release tag push (example `v1.4.2`):

- `<service>-v1.4.2`
- `<service>-1.4.2`
- `<service>-1.4`
- `<service>-sha-<commit>`
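The mapping above can be sketched as a small shell function. This is illustrative only — the authoritative tagging logic lives in `.github/workflows/ci.yml`, and the `<service>-sha-<commit>` tag, which applies in every case, is omitted here:

```shell
# derive_tags <git-ref> <service> — print the non-sha image tags for a ref.
# Illustrative sketch; not the actual CI code.
derive_tags() {
  ref="$1"; service="$2"
  case "$ref" in
    refs/heads/main)   echo "${service}-latest ${service}-main" ;;
    refs/heads/master) echo "${service}-latest ${service}-master" ;;
    refs/tags/v*)
      v="${ref#refs/tags/v}"   # refs/tags/v1.4.2 -> 1.4.2
      mm="${v%.*}"             # 1.4.2            -> 1.4
      echo "${service}-v${v} ${service}-${v} ${service}-${mm}" ;;
  esac
}

derive_tags refs/tags/v1.4.2 write-api
# → write-api-v1.4.2 write-api-1.4.2 write-api-1.4
```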
### Release Command

```bash
git tag v1.0.0
git push origin v1.0.0
```

## GitOps With Argo CD (Next Step)

A GitOps-ready guide is added at `deploy/argocd/README.md`.

Recommended production flow:

- CI builds and pushes versioned images to Docker Hub.
- Argo CD Image Updater tracks allowed semver tags.
- Argo CD syncs Kubernetes manifests from your GitOps repo.
- Rollback is done by reverting the image tag/manifests in Git.
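A minimal Argo CD `Application` for this flow might look like the sketch below. Every value — repo URL, paths, namespaces — is a placeholder, not taken from `deploy/argocd/README.md`:

```yaml
# Placeholder values throughout; see deploy/argocd/README.md for the actual guide.
apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: cqrs-catalog
  namespace: argocd
spec:
  project: default
  source:
    repoURL: https://github.com/<your-org>/<gitops-repo>.git
    targetRevision: main
    path: apps/cqrs-catalog
  destination:
    server: https://kubernetes.default.svc
    namespace: cqrs
  syncPolicy:
    automated:
      prune: true      # delete resources removed from Git
      selfHeal: true   # revert manual drift in the cluster
```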
## Current Status

This repo is currently in an early foundation phase:
- Infrastructure and schema for CQRS are in place
- Solution/project layering is created
- API and worker projects still contain starter template code
- Domain flows, event publishing, and projections are pending implementation
## Roadmap

- Implement write-side product commands and validations.
- Persist domain events to `outbox_messages`.
- Publish outbox events to Kafka (`catalog.events`).
- Consume events in the projection worker and upsert read models.
- Expose query endpoints from `Catalog.Read.Api`.
- Add tests (unit + integration with local infra).
## Contributing

- Create a feature branch from `main`.
- Keep PRs focused and small enough to review quickly.
- Update `README.md` or `docs/` when behavior/setup changes.
- Include tests for non-trivial domain or integration changes.
- Run the build and basic local checks before opening a PR.

Recommended local checks:

```bash
dotnet restore
dotnet build CqrsCatalog.slnx
```

## Branch Strategy
| Branch | Purpose |
|---|---|
| `main` | Stable integration branch |
| `feature/<scope>-<short-description>` | New features and enhancements |
| `fix/<scope>-<short-description>` | Bug fixes |
| `chore/<scope>-<short-description>` | Tooling, docs, maintenance |

Examples:

- `feature/catalog-create-product-command`
- `fix/projection-idempotency-check`
- `chore/readme-tech-stack-sync`
## Commit Convention

Use conventional-style commits:

| Type | Use For |
|---|---|
| `feat` | New behavior or capability |
| `fix` | Bug fix |
| `refactor` | Internal restructuring without behavior change |
| `test` | Adding or improving tests |
| `docs` | Documentation only |
| `chore` | Tooling/build/housekeeping |

Format:

`<type>(<scope>): <short summary>`

Examples:

- `feat(write-api): add create product command endpoint`
- `fix(worker): skip already processed event ids`
- `docs(readme): add architecture decisions table`
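If you want to enforce the format locally, a hypothetical check could be dropped into `.git/hooks/commit-msg`. This script is not part of this repo, and the regex is an assumption derived from the table above:

```shell
#!/bin/sh
# Hypothetical commit-message check for <type>(<scope>): <short summary>,
# using the six types listed above. Not shipped with this repo.
check_commit() {
  printf '%s\n' "$1" | grep -Eq '^(feat|fix|refactor|test|docs|chore)\([a-z0-9-]+\): .+'
}

check_commit "feat(write-api): add create product command endpoint" && echo "ok"
check_commit "added some stuff" || echo "rejected: use <type>(<scope>): <short summary>"
```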
