Building a Workflow Engine with AI Agents
OkieDokie has five workflows: content-approval, campaign-lifecycle, brand-onboarding, trigger-workflow, and email-drip. The content-approval workflow is the core of the system. Here is how it works and what building it taught me.
The Agent Structure
The system has four agents. The base-agent handles core reasoning tasks. It reads context, decides what needs to happen next, and routes to other agents or executes directly. The content-agent specializes in writing and revising copy. The research-agent pulls external data and synthesizes it into usable context. The orchestrator coordinates the three, decides sequencing, and handles handoffs.
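The split described above can be sketched as a routing function. This is an illustrative shape only, not the post's actual implementation: the agent names come from the text, but the Task type and the stub behaviors are assumptions.

```typescript
// Illustrative sketch of the four-agent split; agents are stubbed as functions.
type Task = { kind: "reason" | "write" | "research"; input: string };
type AgentResult = { agent: string; output: string };

const baseAgent = (t: Task): AgentResult =>
  ({ agent: "base-agent", output: `decided: ${t.input}` });
const contentAgent = (t: Task): AgentResult =>
  ({ agent: "content-agent", output: `draft: ${t.input}` });
const researchAgent = (t: Task): AgentResult =>
  ({ agent: "research-agent", output: `context: ${t.input}` });

// The orchestrator only sequences and hands off; it does no content work itself.
function orchestrate(task: Task): AgentResult {
  switch (task.kind) {
    case "write":
      return contentAgent(task);
    case "research":
      return researchAgent(task);
    default:
      return baseAgent(task);
  }
}
```

The point of the shape: each specialist is a narrow function, and the orchestrator's job is reduced to dispatch, which is what made each agent faster and more predictable.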
This structure emerged from a simpler version that had a single agent doing everything. That agent worked for small tasks. For anything involving revision cycles or external data, it became slow and made poor decisions about what to prioritize. Splitting responsibilities made each agent faster and more predictable.
The Bead System
Content lineage tracking was a problem I did not anticipate until I had real content moving through the pipeline.
When a piece of content goes through research, then drafting, then revision, then approval, you need to know which version of the content exists at which stage and which agent produced it. Without that, debugging a bad output means reading through every agent's history to find where it went wrong.
The bead system assigns an identifier to each piece of content at creation. As the content moves through the pipeline, each agent appends its contribution to the bead. The bead is not the content itself. It is the record of what happened to the content and who touched it. If a campaign produces a bad email, I can look at the bead and trace exactly which agent made which decision.
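A minimal sketch of what a bead could look like as a data structure, assuming it is an append-only lineage record keyed by an identifier. The field names here are my own; the post does not show the actual schema.

```typescript
// A bead is the record of what happened to the content, not the content itself.
type BeadEntry = { agent: string; stage: string; at: number };
type Bead = { id: string; entries: BeadEntry[] };

function createBead(id: string): Bead {
  return { id, entries: [] };
}

// Each agent appends its contribution; past entries are never mutated.
function appendToBead(bead: Bead, agent: string, stage: string): Bead {
  return { ...bead, entries: [...bead.entries, { agent, stage, at: Date.now() }] };
}

// Debugging a bad output: walk the entries in order to see who did what.
function trace(bead: Bead): string[] {
  return bead.entries.map((e) => `${e.stage} by ${e.agent}`);
}
```

The append-only discipline is what makes the trace trustworthy: if no agent can rewrite history, the bead is a reliable account of which agent made which decision.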
The Workflow SDK
The workflow SDK I used is a beta package, which meant the integration was not straightforward.
The SDK requires a withWorkflow() wrapper in next.config. Getting that stable took a few tries. The package expects specific environment variables and has peer dependencies that conflict with React 19's stricter peer dep handling. I used --legacy-peer-deps and documented the requirement.
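For context, the wrapper pattern looks roughly like this. This is shape only: "workflow-sdk" is a placeholder for the real beta package name, and the config contents are assumptions, not the SDK's documented options.

```typescript
// next.config.ts — illustrative only; "workflow-sdk" is a placeholder name.
import { withWorkflow } from "workflow-sdk";

const nextConfig = {
  reactStrictMode: true,
};

// The SDK wraps the entire Next.js config and reads its expected
// environment variables at build time.
export default withWorkflow(nextConfig);
```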
The five workflows are defined as SDK workflow objects with named steps. Each step can be a direct action or a handoff to an agent. The campaign-lifecycle workflow has the most steps: strategy, content creation, review, scheduling, and execution. Each step has its own acceptance criteria before the workflow advances.
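The step names for campaign-lifecycle come from the post; everything else below is a hypothetical sketch of what "named steps with acceptance criteria" could look like, not the SDK's actual API.

```typescript
// Each step pairs a name with an acceptance check on the step's output.
type Step = { name: string; accept: (output: string) => boolean };

// The acceptance predicates here are stand-ins for real criteria.
const campaignLifecycle: Step[] = [
  { name: "strategy", accept: (o) => o.length > 0 },
  { name: "content-creation", accept: (o) => o.includes("draft") },
  { name: "review", accept: (o) => o.includes("approved") },
  { name: "scheduling", accept: (o) => o.includes("scheduled") },
  { name: "execution", accept: (o) => o.includes("sent") },
];

// The workflow advances only when the current step's criteria pass.
function advance(steps: Step[], index: number, output: string): number {
  return steps[index].accept(output) ? index + 1 : index;
}
```

Gating advancement on per-step criteria is what keeps a half-finished draft from reaching the scheduling step.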
Route Conflicts
The marketing site for OkieDokie lives in a route group called (marketing). The dashboard has its own route. When I added the marketing landing page, there was a redirect conflict between the dashboard route and the root path.
The fix was explicit: define the marketing layout in the route group and handle the redirect logic in middleware rather than at the page level. Once I moved the redirect out of the page, both routes resolved correctly.
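The redirect decision the middleware makes can be sketched as a pure function. The "/dashboard" path and the session check are assumptions; the post does not show the actual middleware.

```typescript
// Framework-free sketch of the root-path decision. In Next.js this logic
// would live in middleware.ts, returning NextResponse.redirect when the
// result is non-null.
function resolveRoot(pathname: string, hasSession: boolean): string | null {
  // Signed-in users go to the dashboard; null falls through to the
  // (marketing) route group's landing page.
  if (pathname === "/" && hasSession) return "/dashboard";
  return null;
}
```

Keeping this decision in one place is the fix: the page no longer issues its own redirect, so the dashboard route and the root path cannot fight over the same request.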
What the PRD Approach Required
I built OkieDokie against a PRD with 45 stories across four phases. Building a workflow engine against a PRD means the stories have to specify behavior precisely enough to test.
A story that says "content goes through approval" is not testable. A story that says "the content-approval workflow advances from draft to review when the content-agent marks the draft complete, and notifies the assigned reviewer via Slack" is testable. Writing the PRD at that level of specificity before building forced me to make design decisions early that I would otherwise have deferred to implementation time.
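The testable story above can be expressed almost directly as code. The state names come from the story; the notify callback is a stand-in for the real Slack integration, which the post does not show.

```typescript
type ContentState = "draft" | "review" | "approved";

// Advances draft -> review when the draft is marked complete, and notifies
// the assigned reviewer. Any other state is left unchanged.
function onDraftComplete(
  state: ContentState,
  notify: (who: string) => void
): ContentState {
  if (state !== "draft") return state;
  notify("assigned-reviewer");
  return "review";
}
```

A story written at this level hands you the assertions for free: given state "draft", the result is "review" and exactly one notification fires.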
All 45 stories are complete. The Slack integration works. The lead generation routes are live. The email drip sequences run. The PRD was the map. The agent system was how I got there.