Yesterday, ClubHub was mostly an idea and an architecture doc.
Today, it has:
- A working dashboard showing real member and event counts
- A members list you can search and filter
- Subscriptions plumbing ready to go
- A clean, mobile-friendly UI that already feels like a real product
I didn’t hire a team overnight. I leaned hard on two things:
- A fine-grained, well-structured backlog
- A small “team” of AI workers managed like junior developers
This post is about how that combination let the project jump from “nice diagrams” to “clickable product” ridiculously fast.
The setup: ClubHub and the lure
Quick reminder: ClubHub is my side project – a low-cost, humane platform for community clubs to manage members, memberships, events and payments without being squeezed by high-fee apps.
It’s also a deliberate “lure” for late-stage startups:
- Clean Go/App Runner/Postgres architecture
- Multi-tenant design
- Paved-road CI/CD concept
- Strong values around cost, lock-in, and child safety
So the bar is higher than “it compiles”. I want this repo to look like something a small platform team could actually build on.
Step 1 – Turning ideas into a fine-grained backlog
The first unlock was refusing to let the work stay vague.
Instead of “Build membership MVP” as one giant task, I broke it into tiny, independent, testable items, all sitting under:
docs/backlog/epic-A
Things like:
- A0 – health-check endpoint
- A1 – get current club endpoint
- A3 – membership subscription creation endpoint
- A8 – list payments endpoint
- A13 – dashboard metrics API
- A15 – dashboard page implementation
- A16 – members page search and filter
- A20 – create member form
- A24 – membership subscriptions page
- A28 – mobile-first navigation
- A29 – empty states
- A30 – loading and error states
- A31 – form validation and user feedback
Each item has its own small markdown file describing:
- What this piece of behaviour is
- The API shape or UI change
- Any edge cases or validation rules
- How we’ll know it’s done
In other words: a tiny, self-contained user story.
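To make that concrete, here is a sketch of what one of these backlog files might look like. The details below are invented for illustration, not copied from the real ClubHub repo:

```markdown
# A16 – Members page search and filter

## Behaviour
An admin can narrow the members list by free-text search and by role.

## UI change
Add a search input and a role dropdown above the members list.
Filtering runs client-side against the already-loaded members.

## Edge cases
- Empty search shows all members
- Search is case-insensitive and matches name or email

## Done when
- Typing "ann" shows only members whose name or email contains "ann"
- Selecting a role hides members without that role
```

Small enough to read in thirty seconds, precise enough that there is only one reasonable implementation.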
That structure is important because it does two things:
- It makes the work legible to future humans (and to me, when I come back tired).
- It gives AI workers a clear, bounded task instead of “build everything”.
Step 2 – Treating AI like a team of junior developers
The second unlock was how I used AI models.
I didn’t ask “write ClubHub for me”.
Instead, I treated AI like a small team of junior devs plus an automation engineer:
- A “thinking” model for architecture, docs, naming, trade-offs
- A code-focused model in Cursor / CLI for implementation work on specific files
- A CI worker focused only on .gitlab-ci.yml and the build pipeline
- A testing worker focused only on _test.go files and test patterns
Each AI worker gets:
- A single backlog item document (for example A15 – dashboard page implementation)
- The relevant code files (for example frontend/src/pages/Dashboard.tsx)
- Clear instructions: “Modify only what you need to implement this story.”
In practice, the loop looks like this:
- Pick a backlog item (say, A16 – members page search and filter).
- Open the doc and the relevant file in Cursor.
- Ask the AI worker to implement just that behaviour.
- Run tests, lint, and a quick manual click-through.
- Move the markdown file from backlog to done.
Because each item is small and precise, the AI rarely goes off the rails. When it does, the damage is limited to one area.
Step 3 – Why the UI jumped ahead so fast
If you look at the current ClubHub UI, it already feels cohesive:
- A dashboard with cards for members, active subscriptions, upcoming events, and a “recent payments” panel
- A Members screen with search, role filters, and a neat list of people
- A Subscriptions screen ready to display member plans and statuses
- A clean left-hand nav for Dashboard / Members / Subscriptions / Events
That didn’t appear out of nowhere. It grew by stacking small items:
- One item for the dashboard metrics API
- One for rendering those metrics as cards
- One for wiring the members list
- One for search and filter behaviour
- One for mobile-first navigation
- One for empty states
- One for loading and error states
- One for form validation
Each of those went through the same pattern:
- Describe the behaviour in a tiny doc
- Hand it to an AI worker with the right files
- Review, tweak, commit
By the time a human would have finished “designing the dashboard page” in Figma, the AI-assisted code already had:
- A working React page
- API calls to the real backend
- Sensible loading/empty/error behaviour
- A layout that feels good on desktop and phone
Step 4 – AI worker management, not AI magic
The key here isn’t that the AI is magical. It’s that the management pattern is strong:
- Clear architecture upfront. The Go/App Runner/Postgres/React shape was decided before any heavy coding. That gives the AI a solid frame to work inside.
- Fine-grained backlog. The AI never has to hold the whole system in its head. It only has to do “this one thing” well.
- Separation of responsibilities. Different AI “workers” own different concerns: endpoints vs UI vs CI vs tests. That mirrors how real teams organise.
- Human review. Nothing goes in without at least a quick human sniff test. If it feels off, I correct the course and the next request is better.
The result is that I can genuinely say: “We implemented a decent slice of ClubHub in a weekend,” and point to real screenshots, not sketches.
Step 5 – Why this matters beyond a side project
ClubHub is a personal project, but the approach scales.
If you’re in a late-stage startup that’s:
- Drowning in Jira tickets nobody loves
- Struggling to align teams on architecture
- Nervous about the idea of “AI writing our code”
…this pattern gives you a safe, sane way forward.
- Decide the architecture and guardrails first. Make it boring and clear: languages, frameworks, deployment target, security expectations.
- Break the work into tiny, outcome-focused items. Not “implement dashboard”, but “render metrics X/Y/Z given this API”.
- Treat AI as junior devs, not magicians. Each worker gets one story and a clear acceptance description.
- Keep humans in the review and integration loop. AI accelerates throughput; people make the real design and ethical calls.
- Capture the work in docs. ClubHub’s docs/backlog and docs/ai-workers folders are already a kind of living textbook for how this was built.
Closing thoughts
The most satisfying part of this experiment isn’t the screenshots (though I’m pretty fond of the “Club health” dashboard).
It’s the feeling that:
- The backlog is under control.
- The architecture is solid.
- The AI is genuinely a force multiplier, not a chaos machine.
I’m still the one deciding what ClubHub is, who it serves, and what values it encodes.
But by channelling that vision through a fine-grained backlog and a small “team” of AI workers, the gap between “idea” and “working product” shrank dramatically.
And that, to me, is what modern product and platform work should feel like: less grind, more judgement, and visible progress you can actually click on.