Russell Hampton

This entire page is a vibe.

I used Claude Code to build this entire page, concept to code. No typing. Just voice and vibes. In under two hours.

See How I Made This Page →
Day in the Life

What one day looks like as an AI-native designer

Morning

Today I woke up and opened Claude Plan Mode to think through my workflow, then opened the terminal to log into my Claude Code instance, decided to write a quick Markdown file with specific design inspiration for a project, then worked with Claude to scaffold the project. Ate a snack. Wired up Supabase, Mapbox, and some SQL, defined a few design tokens, connected everything to Figma via MCP, tweaked components in both code and Figma, regenerated things about seventeen times too many, watched Claude mess something up and immediately try something else, because Claude is relentless.

Midday

Had some lunch. Pushed my repo to GitHub, deployed it to Vercel to share it out, built custom skills to extend Claude's behavior, spun up agents to research a new space before writing a PRD. Crafted system prompts and user preferences to shape AI tone and output. Ran out of power and had to plug in my computer. Used MCP servers to connect AI directly to live tools and data sources, managed multi-step agentic workflows without losing the thread, debugged AI-generated code without losing my mind.

Afternoon

Iterated on prompts the way a designer iterates on comps, knew when to let the AI run and when to yank the wheel, wrote a PRD informed by AI research instead of vibes. Kept a living Markdown file because good context is half the work, set up persistent memory so Claude actually remembers what matters, chained tools together so agents could hand off work to each other, used Claude in Chrome to automate things I was tired of clicking through manually, pulled live data into an artifact using the Anthropic API so the UI could actually think, reviewed what the agent did and course-corrected instead of just trusting the output.

End of Day

Audited the design system for consistency before a single component got built, defined the visual language before anyone touched a screen, made sure the typography, spacing, and color decisions had a reason behind them. Used AI to pressure-test the information architecture before committing to it, gave feedback on designs the way a creative director would, with a point of view and not just a preference. Pushed back on the easy aesthetic choice in favor of the right user experience. Grabbed a coffee and forgot it was decaf. Wrote design principles that the team could actually use as a decision-making filter, used AI to generate multiple creative directions so the first idea was never the only idea, reviewed AI-generated UI with a critical eye and knew exactly what to throw away. Connected design decisions back to business outcomes so stakeholders understood the why. Ran out of power again and definitely should have bought a longer cable. Ran a design critique with AI-generated concepts as the starting point instead of a blank canvas, documented design decisions so the next person on the project wouldn't have to guess, and ended the day with a deployed product, a sharper PRD, a tighter design system, and a longer list of things I know how to do.
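"Pulled live data into an artifact using the Anthropic API" can be as small as one request. This is a minimal sketch, not the code behind this page: it calls the Anthropic Messages API over plain HTTP, assumes an ANTHROPIC_API_KEY environment variable, and the helper names (build_request, ask_claude) and the model string are my own placeholders.

```python
# Minimal sketch: fetch one model reply for a UI card via the Anthropic
# Messages API. Stdlib only; API key comes from the environment.
import json
import os
import urllib.request

API_URL = "https://api.anthropic.com/v1/messages"
MODEL = "claude-sonnet-4-5"  # assumption: substitute whatever model is current


def build_request(prompt: str, max_tokens: int = 512) -> dict:
    """Shape the Messages API payload: model, token budget, one user turn."""
    return {
        "model": MODEL,
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }


def ask_claude(prompt: str) -> str:
    """POST the payload and return the first text block of the reply."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_request(prompt)).encode(),
        headers={
            "x-api-key": os.environ["ANTHROPIC_API_KEY"],
            "anthropic-version": "2023-06-01",
            "content-type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["content"][0]["text"]


if __name__ == "__main__":
    print(ask_claude("Give me one headline stat for a dashboard card."))
```

Keeping the call behind a tiny helper like this is what lets an artifact's UI treat the model as just another data source.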
I did this knowing today will look nothing like tomorrow.

The Toolkit

AI-native instruments

Every tool in my workflow is chosen for a reason. I build at the bleeding edge so the work stays ahead of the curve.

01
Claude Code
AI Reasoning
02
Antigravity
AI Agent
03
GitHub
Platform
04
Vercel
Infrastructure
05
Lovable
Prototyping
06
Supabase
Database
07
Figma Make
Design + Code
08
Midjourney
Creative AI
09
Runway
Video AI
10
ElevenLabs
Voice AI
11
GPT
AI Reasoning
+
And more
Always exploring
The Process

How it gets made

Screenshots from real AI-native design sessions — the messy, iterative truth.


Voice Interface

Ask me anything
