Sunday, December 07, 2025

One Week of ClubHub – From Blank Repo to Working Platform (With AI Workers)

A week ago, ClubHub was just an idea, a few GoodNotes sketches, and a sense that community clubs are being squeezed by high-fee platforms.


Today there is:

  • A working Go backend with a real data model for clubs, members, memberships, events and payments

  • A clean React frontend with a live dashboard, members list and subscriptions views

  • A paved-road GitLab CI design ready to run tests, build images and deploy

  • A growing backlog of fine-grained stories across product, tech, security and AI features

  • An “AI dev team” wired into the project with clear standards and roles

It’s still early, but it already feels like something a real team could build on.

This post is a quick reflection on what happened in one week, and why it felt so much faster and calmer than a normal solo side project.



Clarifying the mission first

Before writing any Go or React, I used AI as a kind of Product Manager and thinking partner to clarify what ClubHub should be:

  • A low-cost, humane platform for community clubs – sports, music, youth, school

  • No ads, no tracking, no lock-in

  • Very small or zero cost for core admin (members, events, payments)

  • Monetisation only on things clubs already profit from: merch, raffles, fundraising, maybe premium features

  • Designed from day one with child safety and GDPR in mind


We also sketched the bigger “shape”:

  • Local clubs attached to national organisations

  • National organisations attached to global federations

  • Safe, identified comms between verified roles in that hierarchy (admins, coaches), never exposing individual children or normal members globally


That gave the whole week a clear direction. Every technical choice had something to point back to.


Architecture and CI/CD paved road


On the technical side, I deliberately picked a boring, solid architecture:

  • Go modular monolith for the backend

  • Postgres as the main database, multi-tenant by club_id

  • React single-page app (Vite + TypeScript + Tailwind) for the UI

  • Railway / AWS App Runner style deployment – one container per environment, minimal ops

Then I wrote it down:

  • docs/architecture.md – the high-level design and trade-offs

  • docs/paved-road-ci.md – how a GitLab pipeline should look: test, security, build, migrate, deploy, smoke

That “paved road” idea is important. AI can generate code, but I still want everything to flow through a predictable path: tests, security checks, reproducible builds and simple deploys.
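A pipeline of that shape might look roughly like this in .gitlab-ci.yml — the stage names come from the doc, but every job body here is an illustrative assumption, not the actual ClubHub pipeline:

```yaml
stages: [test, security, build, migrate, deploy, smoke]

test:
  stage: test
  image: golang:1.22
  script:
    - go test ./...

security:
  stage: security
  script:
    - gosec ./...   # illustrative; any SAST/dependency scan fits here

build:
  stage: build
  script:
    - docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHA" .
    - docker push "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHA"

migrate:
  stage: migrate
  script:
    - ./scripts/migrate.sh   # illustrative migration step

deploy:
  stage: deploy
  script:
    - ./scripts/deploy.sh    # ship the image to Railway / App Runner

smoke:
  stage: smoke
  script:
    - curl -fsS "$APP_URL/healthz"   # fail the pipeline if the app is down
```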


Fine-grained backlog instead of vague wishes

The biggest productivity unlock was the backlog.

Instead of “Build MVP” as a vague todo, I broke the work into a pile of tiny, concrete stories under docs/backlog/epic-A:

  • A0 – Health-check endpoint

  • A13 – Dashboard metrics API

  • A15 – Dashboard page implementation

  • A16 – Members page search and filter

  • A24 – Membership subscriptions page

  • A28 – Mobile-first navigation

  • A29 – Empty states

  • A30 – Loading and error states

  • A31 – Form validation and feedback

Each story lives in its own small markdown file with:

  • Summary

  • Problem / context

  • Scope (in and out)

  • Acceptance criteria

Most of them are only a page or two of text, but that structure matters. It gives both me and the AI workers something to hold onto: “this is what done looks like for this one little slice.”

By the end of the week, the backlog had grown into multiple epics, including future work around AI-moderated chat, AI scheduling, carpooling, and global organisations and directories. It already looks like a real product roadmap.


AI workers as junior devs

I didn’t try to get AI to “build ClubHub for me”.

Instead I treated the models as a small team of junior developers and specialists:

  • A “thinking” model to discuss architecture, design, testing strategy and product trade-offs

  • A code-focused model inside Cursor to implement specific stories in Go or React

  • A DevOps worker to design and refine the .gitlab-ci.yml and deployment docs

  • A Test worker to focus on unit tests, integration tests and Playwright e2e

Each worker gets:

  • One story file (A15, A16, etc.)

  • The relevant code files

  • Very explicit instructions: change only what you need to deliver this story and keep to the coding/testing/security standards in the /ai folder

My job becomes:

  • Picking the next 1–3 stories to move forward

  • Framing and reviewing the AI work

  • Making the product and ethical decisions

That’s the pattern that made the week feel more like managing a small team than grinding alone in a side project.


Quality first, even in a fast week

I was careful not to let “speed” become “whatever, we’ll fix it later”. Frontend, backend and e2e tests are what keep quality up and make the pace sustainable.

The rule of thumb for the week was:

  • Backend work must pass go test ./... and basic linting

  • Frontend work must build cleanly and behave in the browser

  • Key flows (dashboard, members, subs) should have Playwright e2e coverage

  • The Docker image builds and runs locally the same way it will in CI

The /ai folder holds testing and security standards. AI workers are asked to refer to them and add tests wherever new behaviour appears.

The result is not a perfect system, but it is a project that is testable, deployable and understandable after just seven days of focused work.


What exists today

If I zoom out and look at the repo as if I were a hiring manager or a future teammate, here’s what I see after one week:

  • A Go backend with a clear domain model for clubs, members, memberships, events and payments

  • A React frontend with a working dashboard, members page and subscriptions page, all wired into real APIs

  • A set of architecture, CI and project notes that read like internal platform documentation

  • A fine-grained backlog across multiple epics, including the global federation/national/club hierarchy and AI-assisted features

  • An AI worker ecosystem that codifies how models are used, not just “we threw ChatGPT at it”

And importantly: it still feels calm. The backlog is clear, the architecture is boring-in-a-good-way, and the CI story is more than hand-waving.


What’s next

The short-term next steps are:

  • Get a live deployment running on Railway or App Runner

  • Scrub the repo (secrets, sample data, README) and make it public

  • Add a simple “demo flow” so someone can sign in, play with the dashboard, and see a believable club

After that, I want to focus on two directions:

  • Making the core experience great for a handful of real clubs (music, sports, youth)

  • Pushing the org and federation layer further, so that “all chess clubs in the world” could find each other and collaborate safely if they wanted to

The thing I’m most pleased about after one week is not the feature list. It’s that the project already encodes the values I care about:

  • Clubs over platforms

  • Kids over growth hacks

  • AI as a force multiplier with guardrails, not a replacement for human judgement


If that’s what week one looks like, I’m excited to see what week four and week twelve can bring.

