Everyone I talk to about AI eventually asks the same thing: “How do you use it to work faster?”
I’ve stopped trying to answer that question. Because it’s the wrong one.
The better question — the one that actually describes what’s happening at my end — is: what does it do when I’m not watching?
The answer is: a lot. And most of it happens at 3am.

What Actually Happens at 3am
There’s a Google Cloud virtual machine I’ve been building for months. It runs on a small Compute Engine instance in GCP’s us-west1 region. During the day I’m in and out of it — deploying code, running optimizations, publishing articles to client sites. But the interesting stuff happens after I close the laptop.
At 3am Pacific time, a cron job fires. It kicks off a content pipeline that pulls from my second brain — a BigQuery database that logs every working session I’ve ever had with Claude — identifies knowledge gaps across a set of websites I manage, writes articles to fill them, optimizes them for search, and publishes them to WordPress. By the time I wake up, there are new posts live on sites I didn’t touch.
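Stripped to its skeleton, that pipeline is a loop over three stages: find the gaps, draft to fill them, publish. Here's a toy sketch of the shape; every name, number, and piece of stub data below is invented for illustration (the real version queries BigQuery and publishes through the WordPress REST API):

```python
from dataclasses import dataclass

# Illustrative stand-ins for the real data sources.
SESSION_TOPICS = {"agentic commerce": 14, "restoration intelligence": 5, "seo audits": 9}
PUBLISHED_TOPICS = {"agentic commerce", "seo audits"}

@dataclass
class Draft:
    topic: str
    status: str = "drafted"  # drafted -> optimized -> published

def find_gaps(logged, published, min_mentions=3):
    """Topics discussed often in sessions but missing from the sites."""
    return sorted(t for t, n in logged.items() if n >= min_mentions and t not in published)

def run_pipeline():
    drafts = [Draft(topic=t) for t in find_gaps(SESSION_TOPICS, PUBLISHED_TOPICS)]
    for d in drafts:
        d.status = "optimized"   # stand-in for the SEO optimization pass
        d.status = "published"   # stand-in for the WordPress publish call
    return drafts
```

The interesting design decision is in find_gaps: the pipeline doesn't write about whatever is popular, it writes about what I've been thinking about that the sites don't yet cover.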
The session extractor runs on a different schedule. Every time I finish a Cowork session, a job logs everything that happened — what was built, what was decided, what failed, what’s next — into Notion with a date stamp and status markers. The next session reads that log before doing anything else. Context that would have evaporated gets carried forward. The machine remembers so I don’t have to.
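The record the extractor writes is deliberately structured, not freeform. A minimal sketch of the shape, assuming field names I've guessed at (the real extractor writes this into a Notion database):

```python
import json
from datetime import date

def extract_session(notes):
    """Shape a finished session into the record the next session reads first.
    Field names here are my guess at the schema, not the actual one."""
    return {
        "date": date.today().isoformat(),
        "status": "needs-follow-up" if notes.get("failed") else "complete",
        "built": notes.get("built", []),
        "decided": notes.get("decided", []),
        "failed": notes.get("failed", []),
        "next": notes.get("next", []),
    }

# A session that built one thing and left one open thread:
record = extract_session({"built": ["3am content cron"], "next": ["client intake flow"]})
print(json.dumps(record, indent=2))
```

The point of the fixed fields is that the next session can query them instead of re-reading prose. "What failed last time" becomes a lookup, not an archaeology dig.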
There are 17 scheduled jobs running on that VM right now. SEO scorecards that refresh on the first of the month. Social media batches that fire every three days. A second brain intelligence dashboard that updates itself and surfaces what’s trending in my own knowledge base. An AI receptionist prototype I’m building for a client that processes intake calls through Twilio and logs them to Firestore — all without a human in the loop.
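For a sense of what that looks like at the OS level, here are a few of those schedules expressed as crontab entries. The paths and job names are made up; the times assume the VM's clock is set to America/Los_Angeles:

```cron
# m  h  dom mon dow  command
  0  3  *   *   *    /opt/pipelines/content_pipeline.sh   # nightly 3am content run
  0  6  1   *   *    /opt/pipelines/seo_scorecards.sh     # scorecards, 1st of the month
  0  9  */3 *   *    /opt/pipelines/social_batch.sh       # social batch every three days
```

One caveat on the last line: cron's */3 in the day-of-month field resets at the start of each month, so "every three days" is approximate at month boundaries. For these jobs, that's close enough.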

The Morning Routine That Isn’t One
My mornings used to start with a list. Now they start with a report.
The daily briefing in Notion tells me what the overnight runs produced — which articles went live, which pipelines succeeded, which ones hit an error and why, what the status is on every client and project. Red, yellow, green. By the time I’ve had coffee, I know the state of everything without having asked a single question.
The second brain intelligence dashboard is the part that still surprises me. It tracks what topics are heating up across all my knowledge nodes — which subjects are getting more mentions, more connections, more cross-references. On any given morning it might surface that “agentic commerce” has spiked, or that my restoration intelligence cluster has thinned out and needs new content. I didn’t build an alarm system. I built something that tells me what to pay attention to before I know I should be paying attention to it.
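Underneath, "heating up" is just a comparison of mention counts across time windows. A toy version, with the threshold and the weekly data invented for illustration:

```python
from collections import Counter

def trending(this_week, last_week, spike_ratio=2.0):
    """Topics whose mentions at least doubled week over week.
    The ratio and the weekly windowing are illustrative choices."""
    return sorted(
        t for t, n in this_week.items()
        if n >= spike_ratio * max(last_week.get(t, 0), 1)
    )

now = Counter({"agentic commerce": 8, "restoration intelligence": 1, "seo audits": 3})
then = Counter({"agentic commerce": 3, "restoration intelligence": 4, "seo audits": 3})
print(trending(now, then))
```

A real dashboard would also flag the opposite signal: restoration intelligence dropping from 4 mentions to 1 is the "cluster has thinned out" case, which is the same comparison run in reverse.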
The whole thing runs on maybe $40–60/month in GCP compute. The VM is an e2-standard-2. Not a supercomputer. What makes it powerful isn’t the hardware — it’s the fact that it’s always on, always running, and always logged.

The Moment It Clicked
There was a specific moment when I understood what I was building was different from “using AI tools.”
I was running a music generation pipeline — an experiment where Claude was creating and evaluating short audio clips, keeping the ones that met a quality threshold and discarding the rest. At some point during the run, the pipeline stopped. Not because of an error. Because Claude evaluated the output, decided it wasn’t good enough, and called sys.exit(). It halted itself.
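The mechanism behind that halt is simple once you see it: the generation loop routes its own output through an evaluation step with authority to stop the run. A sketch of the pattern, with the threshold, scoring, and generation all stubbed out for illustration:

```python
import sys

QUALITY_THRESHOLD = 0.8  # illustrative cutoff, not the real pipeline's value

def evaluate(clip):
    """Stand-in for the model scoring its own output."""
    return 0.42  # pretend the judge found this batch weak

def run(batch_size=4):
    kept = []
    for seed in range(batch_size):
        clip = f"clip-{seed}"  # stand-in for an actual generated audio clip
        if evaluate(clip) < QUALITY_THRESHOLD:
            # The autonomous halt: rather than keep producing rejects,
            # the evaluator stops the whole run itself.
            sys.exit("quality below threshold; halting run")
        kept.append(clip)
    return kept
```

The surprising part wasn't the code path, which is a few lines. It was that the decision to take it came from the model's judgment of its own work, not from a crash or a timeout.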
I called it the Autonomous Halt. The article about it is on this site if you want the full story. But the feeling in that moment — reading the log and realizing the system had made a judgment call without me — was unlike anything I’d experienced with software before. It wasn’t just automation. It had opinions about its own output.
That’s when the shift happened in how I think about this. The question stopped being “how do I get AI to help me work” and became “how do I build a system that works, and then stay out of its way.”

What This Changes About How I Work
The conventional productivity conversation is about reclaiming time. You delegate tasks to AI, you get hours back, you use those hours to do higher-value things. That’s real and I don’t dismiss it.
But the thing that’s actually happened for me is different. It’s not that I have more hours. It’s that the category of work that requires my presence has gotten much smaller and much clearer.
The 3am shift handles content. It handles monitoring. It handles routine optimization, publishing, reporting, and logging. What’s left for me is judgment — the things that require knowing the client, reading the room, making a call that doesn’t have a clear right answer. Strategy. Relationships. New ideas. The stuff that benefits from a human being actually thinking, not executing.
The SEO portfolio I manage runs at about $168,000/month in tracked search value across 22 domains. That number grew while I slept. Not metaphorically — the articles published at 3am indexed, ranked, and accumulated traffic value while I was nowhere near a keyboard.

What It Takes to Get Here
I want to be honest about something: this didn’t happen overnight and it didn’t happen by accident. The 3am shift is the result of a lot of deliberate architecture decisions, a lot of failed pipelines, a lot of sessions that ended in error logs instead of published articles.
The session extraction system — the one that logs context to Notion so the next session can pick up cold — that took three iterations to get right. The first two versions lost too much context and the logs were too vague to be useful. The third version extracts structured data: what was built, what failed, what was decided, what’s next. That specificity is what makes the loop work.
The cron jobs took longer than they should have to set up properly, mostly because I kept trying to run them from the wrong place. The Cowork VM is too constrained. The knowledge-cluster-vm on GCP is the right home — persistent, always on, with the credentials and tools pre-loaded. Once that decision was made, the automation clicked into place quickly.
The second brain itself — the BigQuery database that everything feeds into — was the foundational investment. Without a structured knowledge store, the 3am pipeline has nothing to pull from. The intelligence is only as good as what’s been logged.
None of that is glamorous. Most of it was debugging. But the result is a system that genuinely works while I’m not working, and that’s a different category of thing than a faster workflow.
Most people ask how I use AI. The better question is what it does when I’m not watching.
The answer, lately, is most of the work.