Module 7 · Capstone · ~30 min setup
Pull together everything from the previous six modules and aim it at a real Canton sandbox. You'll build a Go service that connects, submits commands, streams transaction updates, and packages cleanly as a CLI. Four phases — each phase produces a working artifact you can stop at.
By the end of phase 4 you'll have a single binary cantonctl with subcommands:
```shell
cantonctl ping                                # health-check the sandbox
cantonctl submit --party Alice --payload ...  # submit a command
cantonctl stream --party Alice                # stream transaction updates
```
It will speak gRPC to a Canton participant's Ledger API, handle TLS or insecure as configured, propagate context for clean shutdown, and emit structured logs.
| Tool | Why | Install |
|---|---|---|
| Docker | Run the Canton sandbox locally | docs.docker.com/get-docker |
| Go 1.22+ | Build the client | brew install go or go.dev/dl |
| buf | Generate Ledger API stubs from protobufs | brew install bufbuild/buf/buf |
| protoc-gen-go & protoc-gen-go-grpc | Buf delegates to these | go install google.golang.org/protobuf/cmd/protoc-gen-go@latest && go install google.golang.org/grpc/cmd/protoc-gen-go-grpc@latest |
| grpcurl (optional) | Manual gRPC poking; useful when debugging | brew install grpcurl |
Digital Asset publishes a Canton container that runs an in-memory participant + synchronizer + sandbox Daml model. The exact image name and version may shift; check the Canton 3.5 docs for the current canonical command. A typical incantation looks like:
```shell
# Adjust the image tag to match a real published version.
# 5011 = Ledger API gRPC port, 5012 = Admin API port.
docker run --rm -it \
  -p 5011:5011 \
  -p 5012:5012 \
  digitalasset/canton-open-source:latest \
  daemon --config /examples/01-simple-topology/simple-topology.conf
```
You'll know it's up when you see logs about "Ledger API server listening on 5011." If the published image structure differs, the docs will spell out the right invocation — the shape (a participant exposing 5011 over gRPC) is what matters.
You can do phases 1–3 against a fake gRPC server using bufconn (see Module 5 exercise). The wire is the same; only the live integration is missing. Phase 4 (CLI packaging) doesn't depend on a live server at all.
Phase 1 walks through cloning the daml repo at the right tag and copying the protos in. Outline:
1. Clone github.com/digital-asset/daml at a tag matching your sandbox version.
2. Copy ledger-api/grpc-definitions/com/daml/ledger/api/v2/ into your module's proto/ tree.
3. Write buf.yaml + buf.gen.yaml as in Module 5 Exercise 1.
4. Run buf generate → Go stubs.
5. Run go mod tidy → resolve all transitive deps (grpc, protobuf, well-known types, possibly pieces of scalapb or googleapis).
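For reference, a buf.gen.yaml along these lines (v1 config schema) drives the two protoc plugins from the tools table; adjust `out` to wherever your module keeps generated code:

```yaml
version: v1
plugins:
  - plugin: go
    out: gen
    opt: paths=source_relative
  - plugin: go-grpc
    out: gen
    opt: paths=source_relative
```

`paths=source_relative` keeps the generated file layout mirroring the proto tree, which makes the import paths predictable.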
```
┌──────────────────────────────────────┐
│              cantonctl               │
│  ┌──────────────────────────────┐    │
│  │ cmd/cantonctl/main.go        │    │
│  │ (cobra commands wire here)   │    │
│  └──────────┬───────────────────┘    │
│             │ uses                   │
│  ┌──────────▼───────────────────┐    │
│  │ internal/ledger/Client       │    │
│  │ Connect, Submit, Stream      │    │
│  │ (wraps generated stubs)      │    │
│  └──────────┬───────────────────┘    │
└─────────────┼────────────────────────┘
              │ gRPC over HTTP/2
              ▼
      Canton sandbox
   (Ledger API on :5011)
```
Standard separation: a Client wrapper inside your module exposes a clean Go API; the CLI main binds command-line flags to that API. The wrapper is testable; the CLI is thin.
| Phase | What you build | Done when |
|---|---|---|
| 1 — Connect | Ledger API gRPC connection. Health check / version call. | cantonctl ping prints the sandbox's version. |
| 2 — Submit | Command submission via CommandService.SubmitAndWait. | Submit creates a contract; you get back a non-empty completion offset. |
| 3 — Stream | Transaction stream via UpdateService. Print updates as they arrive. | Run stream in one terminal; submit in another; see the new transaction appear. |
| 4 — CLI | Cobra-based subcommands, flags, structured logging, graceful shutdown. | Single binary, --help shows all subcommands cleanly, Ctrl-C exits cleanly. |
The capstone is intentionally open-ended. Definition of done:
- Run cantonctl stream against a fresh Canton sandbox, run cantonctl submit from another terminal, and see your submission show up in the stream output.
- Clean runs of go vet ./... and go test -race ./... for any unit tests you write.
- Bonus, if you have time: add a Prometheus metric for "submissions per second by outcome," and a basic OTel trace span around the gRPC calls.
Each phase has its own subdirectory with a starting skeleton and a README. They're meant to be read in order — phase 1 sets up the module structure that phases 2–4 extend.
For phases 2–4, you'll typically copy your phase-1 directory and extend it, rather than starting from scratch. The phases are stages of one project, not separate projects.