Helena Township Parks · Capital Improvement Inventory

How this runs

A small township, an edge CDN, serverless functions at the edge, a GPU classifier, and a bucket of photos. Below is the whole stack — what each piece does, how the bytes flow, and why this architecture works for a community of about 1,200 people the same way it would work for a few hundred million.

The one-paragraph version

Photos are uploaded from a volunteer's phone while walking the parks, straight to Linode Object Storage via a Spin-based edge function running on Akamai Cloud Functions. A GPU classifier on a Linode Kubernetes cluster (Qwen2.5-VL-7B + CLIP) identifies each feature and rates its condition. A HEIC conversion sidecar in the same cluster handles iPhone photo formats. The resulting catalog — assets, parks, condition, metadata — is materialized as a single JSON snapshot in object storage and served to the committee and the public through a second Spin function, globally, via Akamai's CDN. Every request is logged through Akamai DataStream 2 to a ClickHouse cluster and visualized in Grafana.

Architecture at a glance (diagram):

- Clients: volunteer phone (walks & uploads), committee / public browsing the catalog, Akamai staff (reference case)
- Akamai CDN — TLS termination · caching · path routing · token gate; serves www.helenatownshipparks.com
- helena-parks-public — GET only · public_mode=true; serves /v2/public/* and /v1/*; reads the S3 snapshot in one shot (no token)
- helena-parks (committee) — token-gated · edit capable; ingest + classify coordinator; writes back to the S3 snapshot (token)
- helena-parks-ingest-e3 bucket — photos (originals + thumb + web); catalog/assets.json · catalog/parks.json; versioned · audit trail
- LKE (Linode Kubernetes) — GPU pod: Qwen2.5-VL-7B + CLIP ViT-L/14; HEIC convert sidecar; RTX 4000 Ada · us-ord (classify)

What you're looking at

The same binary is deployed as two separate Akamai Cloud Functions apps. One (helena-parks-public) is started with public_mode=true and physically refuses any non-GET verb and any /v1/internal/* path — the edit routes simply are not reachable. The other (helena-parks) is the committee-facing app with the full write surface, gated by a shared token at the CDN layer.
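The public-mode short-circuit can be sketched as a single predicate. This is an illustrative reconstruction, not the actual helena-parks source; the function name and route strings beyond those quoted above are assumptions.

```rust
/// Sketch of the public_mode=true gate: the public app accepts only GET
/// requests and refuses anything under /v1/internal/*, so the edit
/// surface is unreachable by construction.
fn public_mode_allows(method: &str, path: &str) -> bool {
    method == "GET" && !path.starts_with("/v1/internal/")
}

fn main() {
    // Read path is fine; writes and internal routes are not.
    assert!(public_mode_allows("GET", "/v2/public/catalog"));
    assert!(!public_mode_allows("POST", "/v2/public/catalog"));
    assert!(!public_mode_allows("GET", "/v1/internal/ingest"));
    println!("gate ok");
}
```

Because the check runs inside the binary rather than in routing config, a misconfigured CDN rule still cannot reach an edit route on the public app.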

Requests land on the Akamai CDN first. If they carry a valid token (query string or cookie), the CDN routes to the committee function. If not, the CDN routes to the public function. The browser never knows which origin it hit — both respond under www.helenatownshipparks.com, same certificate, same everything.
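The routing decision itself lives in the Akamai property, not in application code, but its logic amounts to the following sketch (names and the token-comparison detail are assumptions):

```rust
/// Which origin the CDN forwards to.
#[derive(Debug, PartialEq)]
enum Origin {
    Committee, // helena-parks: token-gated, edit capable
    Public,    // helena-parks-public: GET only
}

/// A request with a valid token (from query string or cookie) goes to the
/// committee function; everything else falls through to the public one.
fn select_origin(presented: Option<&str>, expected: &str) -> Origin {
    match presented {
        Some(t) if t == expected => Origin::Committee,
        _ => Origin::Public,
    }
}

fn main() {
    assert_eq!(select_origin(Some("s3cret"), "s3cret"), Origin::Committee);
    assert_eq!(select_origin(Some("wrong"), "s3cret"), Origin::Public);
    assert_eq!(select_origin(None, "s3cret"), Origin::Public);
    println!("routing ok");
}
```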

Akamai CDN (Ion)

TLS termination, global caching, token gate, path-based origin override, and Let's Encrypt DV cert reissue. Single property, both apps, one cert.

Akamai Cloud Functions

Fermyon Spin 3.x on WebAssembly (WASI-p1), deployed with spin aka deploy. One Rust binary, two apps, same code — public-mode flag short-circuits edit routes.

Linode Object Storage

S3-compatible bucket on E3 cluster (us-ord-10). Holds the original photos, their derivatives (thumb + web), and the single-file catalog snapshots. Bucket versioning enabled.
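The snapshot that the public function reads "in one shot" might look like the following. The field names here are illustrative guesses at the catalog shape, not the bucket's actual schema; only the object keys catalog/assets.json and catalog/parks.json come from the source.

```json
{
  "generated_at": "2025-06-01T12:00:00Z",
  "parks": [
    { "id": "park-01", "name": "Example Park" }
  ],
  "assets": [
    {
      "id": "asset-0001",
      "park": "park-01",
      "kind": "bench",
      "condition": "fair",
      "photos": { "original": "…", "web": "…", "thumb": "…" }
    }
  ]
}
```

Serving one pre-materialized file means the read path never touches a database: the CDN caches a single object, and versioning on the bucket doubles as the audit trail.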

LKE GPU classifier

Qwen2.5-VL-7B-Instruct + OpenCLIP ViT-L/14 on an RTX 4000 Ada node in LKE. vLLM runtime, typed-prompted VLM for classification + condition, CLIP for nearest-neighbor dedup across visits.
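The CLIP side of that pipeline reduces to embedding comparison. A minimal sketch of the nearest-neighbor dedup step, under the assumption that it thresholds cosine similarity between a new photo's embedding and embeddings from prior visits (the threshold and vectors below are illustrative):

```rust
/// Cosine similarity between two embeddings of equal length.
fn cosine(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let nb: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    dot / (na * nb)
}

/// Index of the best-matching prior embedding, if it clears the threshold;
/// a hit means "same physical asset, seen on an earlier visit".
fn nearest_match(new: &[f32], prior: &[Vec<f32>], threshold: f32) -> Option<usize> {
    prior
        .iter()
        .enumerate()
        .map(|(i, p)| (i, cosine(new, p)))
        .filter(|(_, s)| *s >= threshold)
        .max_by(|a, b| a.1.partial_cmp(&b.1).unwrap())
        .map(|(i, _)| i)
}

fn main() {
    let new = vec![0.9f32, 0.1];
    let prior = vec![vec![0.0, 1.0], vec![1.0, 0.0]];
    println!("{:?}", nearest_match(&new, &prior, 0.8));
}
```

In production the embeddings would be 768-dimensional ViT-L/14 outputs rather than toy 2-vectors, but the decision rule is the same.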

HEIC sidecar

A small Python sidecar in the same LKE cluster that converts iPhone HEIC uploads to JPEG before ingest. A deliberate trade: edge WASM can't link libheif, so the heavy lifting runs in LKE behind a private endpoint.
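The fork in the ingest path is a content-type check. A hedged sketch (the helper name is hypothetical; the MIME types are the standard ones for HEIC/HEIF):

```rust
/// True when an upload must be forwarded to the LKE sidecar for JPEG
/// conversion; JPEG and PNG uploads are ingested directly at the edge,
/// since edge WASM can't link libheif.
fn needs_sidecar(content_type: &str) -> bool {
    matches!(content_type, "image/heic" | "image/heif")
}

fn main() {
    assert!(needs_sidecar("image/heic"));
    assert!(!needs_sidecar("image/jpeg"));
    println!("ingest fork ok");
}
```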

DataStream 2 → ClickHouse → Grafana

Every edge request is logged through Akamai DS2 with 17 standard fields and delivered to a shared ClickHouse cluster, then rendered in a Grafana dashboard alongside every other demo. Real observability from day one.
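A table receiving those logs could be shaped roughly like this. This is a sketch only: the shared cluster's actual schema, and which of the 17 DS2 fields it keeps, are not specified in the source, and the column names below are assumptions.

```sql
-- Illustrative ClickHouse landing table for DataStream 2 edge logs.
CREATE TABLE IF NOT EXISTS edge_logs
(
    req_time            DateTime,
    status_code         UInt16,
    req_method          LowCardinality(String),
    req_path            String,
    country             LowCardinality(String),
    turn_around_time_ms UInt32
)
ENGINE = MergeTree
ORDER BY (req_time);
```

Grafana then queries this table directly, which is what makes per-demo dashboards cheap to stand up.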

Why the architecture matters