
Motia: Event-Driven API Workflows in 2026

By the APIScout Team


TL;DR

Motia ranked #1 backend/full-stack in JS Rising Stars 2025 by solving a problem every API developer knows: production backends aren't just REST endpoints. They're REST endpoints plus queue workers plus cron jobs plus webhook handlers plus workflow state machines. Motia collapses all of them into a single primitive — the Step — and connects them through an event bus baked into the runtime. The result is API design that reads like a directed graph instead of a collection of disconnected services.

Key Takeaways

  • One primitive, five trigger types — HTTP, queue event, cron, state change, and stream all use identical Step structure; only the config.type changes
  • Emit-subscribe replaces controller chaining — Steps emit named events; other Steps subscribe; the runtime wires them without explicit imports or dependency injection
  • Changing execution model = changing one field — switching an API route to a background job requires changing type: 'api' to type: 'event', nothing else
  • Built-in retries, backoffs, and timeouts — declared in config, handled by the runtime; no custom middleware or Bull queue retry logic
  • Visual Workbench ships with every project — live flow graph, event injection, request tracing at localhost:3111
  • Zero-downtime adapter swaps — memory → Redis → SQS/Kafka in iii-config.yaml without touching Step code

The REST API Design Trap

REST APIs get shipped with synchronous, request-response thinking. A POST /orders handler validates the request, writes to the database, sends a confirmation email, and enqueues a fulfillment job — all in the same controller function, all blocking the HTTP response.

This works until it doesn't. The email provider is slow. The fulfillment service is down. The database write succeeds but the email fails and now the order is in an inconsistent state. The solution — offload side effects to a queue — requires wiring BullMQ or Inngest alongside Express, writing separate worker files, handling the queue connection lifecycle, and building observability across two different execution environments.

By the time a production Express or NestJS backend handles this correctly, it looks like this:

src/
├── controllers/
│   └── orders.controller.ts     # HTTP handler
├── services/
│   └── orders.service.ts        # Business logic
├── jobs/
│   ├── fulfillment.worker.ts    # BullMQ worker
│   └── email.worker.ts          # Email queue worker
├── queues/
│   ├── fulfillment.queue.ts     # Queue definitions
│   └── email.queue.ts
└── cron/
    └── order-expiry.cron.ts     # Scheduled cleanup

Seven files for one business flow. Each with its own import graph, its own error handling pattern, and its own visibility story.

Motia's answer: the entire order flow is five Step files, each identical in structure, wired together by named events.


The Motia Step: One Pattern for All API Operations

Every Motia Step is a single file with two exports — config and handler:

export const config = { /* what triggers this step */ }
export async function handler(input, ctx) { /* what it does */ }

The config.type field determines the trigger. The handler is always the same function signature. There are no controllers, no services, no workers, no queue definitions — just Steps.

HTTP API Step

// src/create-order.step.ts
import { ApiRoute, FlowContext } from 'motia'

export const config: ApiRoute = {
  type: 'api',
  method: 'POST',
  path: '/orders',
  emits: ['order.created'],
  flows: ['order-fulfillment'],
}

export async function handler(
  req: { body: { productId: string; userId: string; quantity: number } },
  { emit, logger }: FlowContext
) {
  const order = {
    id: crypto.randomUUID(),
    ...req.body,
    status: 'pending',
    createdAt: new Date().toISOString(),
  }

  logger.info('Order received', { orderId: order.id })
  await emit('order.created', order)

  return { status: 202, body: { orderId: order.id, status: 'processing' } }
}

The handler builds the order object, emits order.created, and returns immediately with a 202 Accepted. The HTTP response doesn't wait on order processing — that happens downstream, in other Steps that subscribe to order.created.

This is the core API design shift: the HTTP layer declares intent by emitting events, not by calling services directly.
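Because a Step handler is just a function of (input, context), it can be exercised without an HTTP server or a queue. Here's a condensed sketch of the handler above with a locally stubbed FlowContext — the real type comes from motia; the stub only models the pieces this handler uses:

```typescript
import { randomUUID } from 'node:crypto'

// Local stand-in for Motia's FlowContext — illustrative, not the real export.
type FlowContext = {
  emit: (topic: string, data: unknown) => Promise<void>
  logger: { info: (msg: string, meta?: object) => void }
}

export async function handler(
  req: { body: { productId: string; userId: string; quantity: number } },
  { emit, logger }: FlowContext
) {
  const order = { id: randomUUID(), ...req.body, status: 'pending' }
  logger.info('Order received', { orderId: order.id })
  await emit('order.created', order)
  return { status: 202, body: { orderId: order.id, status: 'processing' } }
}

// Exercise it with a recording stub: the emit is captured, nothing is dispatched.
void (async () => {
  const emitted: string[] = []
  const res = await handler(
    { body: { productId: 'p1', userId: 'u1', quantity: 2 } },
    { emit: async (topic) => { emitted.push(topic) }, logger: { info: () => {} } }
  )
  console.log(res.status, emitted[0]) // prints "202 order.created"
})()
```

The same trick works for every Step type below, because the handler signature never changes.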

Event Step (Background Job)

// src/process-payment.step.ts
import Stripe from 'stripe'
import { EventConfig, FlowContext } from 'motia'

const stripe = new Stripe(process.env.STRIPE_SECRET_KEY!)

export const config: EventConfig = {
  type: 'event',
  subscribes: ['order.created'],
  emits: ['payment.processed', 'payment.failed'],
  flows: ['order-fulfillment'],
}

export async function handler(
  order: { id: string; userId: string; productId: string; quantity: number },
  { emit, logger }: FlowContext
) {
  logger.info('Processing payment', { orderId: order.id })

  try {
    const payment = await stripe.charges.create({ /* ... */ })
    await emit('payment.processed', { orderId: order.id, chargeId: payment.id })
  } catch (error) {
    const message = error instanceof Error ? error.message : String(error)
    await emit('payment.failed', { orderId: order.id, error: message })
  }
}

This Step runs asynchronously after the HTTP response is sent. Retries, backoffs, and dead-letter handling are configured at the runtime level — this file only contains payment logic.

Notice the config: subscribes: ['order.created'], emits: ['payment.processed', 'payment.failed']. The flow graph is declarative. The Motia Workbench reads these configs and renders the entire order fulfillment DAG visually, without any additional documentation.

Cron Step

// src/expire-orders.step.ts
import { CronConfig, FlowContext } from 'motia'

export const config: CronConfig = {
  type: 'cron',
  cron: '0 * * * *',            // every hour
  emits: ['orders.expiry-check'],
  flows: ['order-maintenance'],
}

export async function handler(_: void, { emit, logger }: FlowContext) {
  logger.info('Running order expiry check')
  await emit('orders.expiry-check', { triggeredAt: new Date().toISOString() })
}

A cron Step emits an event just like an HTTP Step does. The downstream processing Step that handles orders.expiry-check is identical whether triggered by the scheduler or by a manual event injection through the Workbench.
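That downstream Step might look like this — a hypothetical expire-pending-orders.step.ts with locally stubbed types; the findStalePendingOrders helper stands in for a real database query:

```typescript
// Hypothetical src/expire-pending-orders.step.ts — subscribes to the
// cron-emitted event. Types are stubbed locally for illustration.
type FlowContext = {
  emit: (topic: string, data: unknown) => Promise<void>
  logger: { info: (msg: string, meta?: object) => void }
}

export const config = {
  type: 'event',
  subscribes: ['orders.expiry-check'],
  emits: ['orders.expired'],
  flows: ['order-maintenance'],
}

export async function handler(
  { triggeredAt }: { triggeredAt: string },
  { emit, logger }: FlowContext
) {
  // Illustrative rule: pending orders older than 24 hours expire.
  const cutoff = new Date(Date.parse(triggeredAt) - 24 * 60 * 60 * 1000)
  const expiredIds = await findStalePendingOrders(cutoff)
  logger.info('Expired stale orders', { count: expiredIds.length })
  await emit('orders.expired', { orderIds: expiredIds })
}

// Stand-in for a real database query, so the sketch runs as-is.
async function findStalePendingOrders(_cutoff: Date): Promise<string[]> {
  return []
}
```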

State Step

// src/update-order-status.step.ts
import { EventConfig, FlowContext } from 'motia'

export const config: EventConfig = {
  type: 'event',
  subscribes: ['payment.processed'],
  emits: ['order.confirmed'],
  flows: ['order-fulfillment'],
}

export async function handler(
  { orderId, chargeId }: { orderId: string; chargeId: string },
  { emit, state, logger }: FlowContext
) {
  await state.set(`order:${orderId}`, { status: 'confirmed', chargeId })
  logger.info('Order confirmed', { orderId })
  await emit('order.confirmed', { orderId })
}

The state context is a key-value store backed by whichever adapter you've configured (memory for local dev, Redis for production). Steps can read and write state without importing a separate Redis client — the context injects it.
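Reading that state back from another Step uses the same injected context. Here's a sketch of a hypothetical GET /orders/:id/status route — the route, field names, and stubbed types are illustrative, not Motia's documented API:

```typescript
// Hypothetical src/get-order-status.step.ts — reads workflow state
// through the injected context; no Redis client is imported anywhere.
type OrderState = { status: string; chargeId?: string }
type FlowContext = {
  state: {
    get: (key: string) => Promise<OrderState | null>
    set: (key: string, value: OrderState) => Promise<void>
  }
}

export const config = {
  type: 'api',
  method: 'GET',
  path: '/orders/:id/status',
  flows: ['order-fulfillment'],
}

export async function handler(
  req: { params: { id: string } },
  { state }: FlowContext
) {
  const order = await state.get(`order:${req.params.id}`)
  if (!order) return { status: 404, body: { error: 'order not found' } }
  return { status: 200, body: { orderId: req.params.id, status: order.status } }
}
```

Swapping the adapter from memory to Redis changes where `state.get` reads from — the handler stays identical.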


API Design Patterns in Motia

The Step primitive enables patterns that are awkward or expensive in traditional REST frameworks.

Pattern 1: The Fan-Out API

One HTTP request triggers multiple parallel background processes. In Express, this means enqueuing multiple jobs and managing their completion. In Motia:

// HTTP step emits one event
await emit('user.registered', { userId, email, plan })

// Three independent event steps subscribe to the same event:
// send-welcome-email.step.ts      subscribes: ['user.registered']
// create-workspace.step.ts        subscribes: ['user.registered']
// add-to-crm.step.ts              subscribes: ['user.registered']

All three run in parallel, automatically, without any coordination code in the HTTP handler. Add a fourth step by creating a new file and adding subscribes: ['user.registered'] to its config — zero changes to existing code.

This is the fan-out pattern. It's a foundational building block of event-driven APIs, and in Motia it's the default rather than an architectural decision requiring dedicated tooling.
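That fourth subscriber is just another file. A hypothetical trial-activation Step, with locally stubbed types and an illustrative business rule:

```typescript
// Hypothetical src/activate-trial.step.ts — joins the fan-out by
// subscribing to the same event; the existing Steps are untouched.
type FlowContext = {
  emit: (topic: string, data: unknown) => Promise<void>
  logger: { info: (msg: string, meta?: object) => void }
}

export const config = {
  type: 'event',
  subscribes: ['user.registered'],
  emits: ['trial.activated'],
}

export async function handler(
  { userId, plan }: { userId: string; email: string; plan: string },
  { emit, logger }: FlowContext
) {
  // Illustrative rule: free plans get no trial, paid plans get 14 days.
  if (plan === 'free') return
  const expiresAt = new Date(Date.now() + 14 * 24 * 60 * 60 * 1000)
  logger.info('Trial activated', { userId, plan })
  await emit('trial.activated', { userId, expiresAt: expiresAt.toISOString() })
}
```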

Pattern 2: The Saga (Long-Running Workflow)

Distributed transactions across services are notoriously difficult. The Saga pattern coordinates a multi-step workflow where each step has a compensating action if it fails. In Motia, sagas map directly to chains of emit/subscribe events:

order.created
  ├─ payment.processed
  │    ├─ inventory.reserved → order.confirmed → notification.sent
  │    └─ inventory.failed   → payment.refunded → order.cancelled
  └─ payment.failed → order.cancelled → notification.sent

Each node in this graph is a Step file. Each edge is a named event. The Workbench renders this graph live. When a payment fails in production, you open the Workbench, find the failed trace, inspect the event payload at the failure point, inject a test event to reproduce it, and fix the Step file — all without touching the other Steps in the flow.
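A compensating action in this saga is an ordinary Step subscribing to a failure event. A sketch of the payment.refunded node, with locally stubbed types and the actual refund call left as a comment:

```typescript
// Hypothetical src/refund-payment.step.ts — the compensating action
// for a failed inventory reservation. Types are stubbed locally.
type FlowContext = {
  emit: (topic: string, data: unknown) => Promise<void>
  logger: { info: (msg: string, meta?: object) => void }
}

export const config = {
  type: 'event',
  subscribes: ['inventory.failed'],
  emits: ['payment.refunded'],
  flows: ['order-fulfillment'],
}

export async function handler(
  { orderId, chargeId }: { orderId: string; chargeId: string },
  { emit, logger }: FlowContext
) {
  logger.info('Compensating: refunding charge', { orderId, chargeId })
  // A real Step would call the payment provider's refund API here,
  // e.g. stripe.refunds.create({ charge: chargeId }).
  await emit('payment.refunded', { orderId, chargeId })
}
```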

Pattern 3: The Webhook Ingestion Pipeline

Webhook handlers are a common pain point for REST APIs. Stripe, GitHub, Slack, and every other major platform send HTTP callbacks that need to be validated, acknowledged quickly, and processed asynchronously. In Express, this means a controller that validates the signature, enqueues the event, and returns 200 — then a separate worker that processes the job.

In Motia:

// src/stripe-webhook.step.ts
import Stripe from 'stripe'
import { ApiRoute } from 'motia'

const stripe = new Stripe(process.env.STRIPE_SECRET_KEY!)

export const config: ApiRoute = {
  type: 'api',
  method: 'POST',
  path: '/webhooks/stripe',
  emits: ['stripe.payment_intent.succeeded', 'stripe.charge.refunded'],
  flows: ['stripe-processing'],
}

export async function handler(req, { emit, logger }) {
  const event = stripe.webhooks.constructEvent(
    req.rawBody,
    req.headers['stripe-signature'],
    process.env.STRIPE_WEBHOOK_SECRET
  )

  logger.info('Stripe webhook received', { type: event.type })
  await emit(`stripe.${event.type}`, event.data.object)
  return { status: 200, body: { received: true } }
}

The webhook handler validates and routes. Downstream Steps handle each event type independently. Adding a new Stripe event type is one new Step file with the right subscribes config — no changes to the webhook handler.
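One of those downstream Steps might look like this — a hypothetical handler for stripe.payment_intent.succeeded, with locally stubbed types; the metadata.orderId convention is an assumption for the sketch:

```typescript
// Hypothetical src/handle-payment-succeeded.step.ts — one downstream
// Step per Stripe event type. Types are stubbed locally.
type FlowContext = {
  emit: (topic: string, data: unknown) => Promise<void>
  state: { set: (key: string, value: unknown) => Promise<void> }
  logger: { info: (msg: string, meta?: object) => void }
}

export const config = {
  type: 'event',
  subscribes: ['stripe.payment_intent.succeeded'],
  emits: ['invoice.paid'],
  flows: ['stripe-processing'],
}

export async function handler(
  paymentIntent: { id: string; amount: number; metadata: { orderId?: string } },
  { emit, state, logger }: FlowContext
) {
  const orderId = paymentIntent.metadata.orderId
  if (!orderId) {
    logger.info('Payment intent without order metadata', { id: paymentIntent.id })
    return
  }
  await state.set(`order:${orderId}`, { status: 'paid', paymentIntentId: paymentIntent.id })
  await emit('invoice.paid', { orderId, amount: paymentIntent.amount })
}
```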

Pattern 4: The Streaming API Response

For AI and LLM-backed API endpoints, streaming the response token by token is expected in 2026. Motia handles this through the stream context available to Steps:

// src/generate-summary.step.ts
export const config: ApiRoute = {
  type: 'api',
  method: 'POST',
  path: '/summaries',
  emits: ['summary.requested'],
  flows: ['ai-generation'],
}

// src/stream-llm-response.step.ts
import OpenAI from 'openai'
import { EventConfig } from 'motia'

const openai = new OpenAI()   // reads OPENAI_API_KEY from the environment

export const config: EventConfig = {
  type: 'event',
  subscribes: ['summary.requested'],
  flows: ['ai-generation'],
}

export async function handler({ content, requestId }, { stream, logger }) {
  const response = await openai.chat.completions.create({
    model: 'gpt-4',
    messages: [{ role: 'user', content: `Summarize: ${content}` }],
    stream: true,
  })

  for await (const chunk of response) {
    const token = chunk.choices[0]?.delta?.content ?? ''
    await stream.write(requestId, token)
  }

  await stream.close(requestId)
}

The HTTP Step returns a stream ID. The client polls or subscribes to the stream endpoint. The LLM tokens flow through the Motia stream adapter — memory for local dev, Redis Pub/Sub or Kafka for production.
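The stub-context trick from earlier applies to streaming Steps too. A sketch where the live LLM call is replaced by a canned token source, and the stream slice of the context is stubbed locally — both are illustrative, not Motia's real API:

```typescript
// Local stand-ins: a token generator instead of an LLM call, and the
// stream portion of the context. Names and shapes are illustrative.
type StreamContext = {
  stream: {
    write: (id: string, chunk: string) => Promise<void>
    close: (id: string) => Promise<void>
  }
}

async function* tokenSource(text: string) {
  for (const word of text.split(' ')) yield word + ' '
}

export async function handler(
  { content, requestId }: { content: string; requestId: string },
  { stream }: StreamContext
) {
  for await (const token of tokenSource(`summary of: ${content}`)) {
    await stream.write(requestId, token)
  }
  await stream.close(requestId)
}
```

Because the handler only touches `stream.write` and `stream.close`, the same code runs against the memory adapter locally and Redis Pub/Sub in production.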


Motia vs Traditional REST Frameworks: Architecture Comparison

| Dimension | Express / Hono | NestJS | Motia |
|---|---|---|---|
| HTTP routing | Manual router.post('/path', handler) | Decorators (@Post('/path')) | config.type: 'api' in Step file |
| Background jobs | Install BullMQ, write workers separately | @InjectQueue(), separate consumer | Change type: 'event', add subscribes |
| Cron scheduling | Install node-cron, separate file | @Cron() decorator, separate module | config.type: 'cron' in Step file |
| Event bus | Manual EventEmitter or separate Kafka client | EventEmitter2, separate module setup | Built-in, wired through emit/subscribes |
| Retry logic | Custom middleware or BullMQ job options | Separate retry interceptor | Declared in runtime config, automatic |
| Observability | Add Datadog/Honeycomb manually | nestjs-pino, separate setup | Workbench ships with every project |
| Multi-language | Node.js only | Node.js only | TypeScript, JavaScript, Python, Ruby (beta) |
| Local dev | Nodemon + separate queue worker process | nest start --watch + separate worker | motia dev starts everything |
| Infrastructure config | Environment variables + manual adapters | Configuration module + providers | iii-config.yaml adapter swaps |

The Key Insight: Execution Model Portability

The most consequential difference between Motia and traditional REST frameworks isn't the syntax — it's what changes when your execution requirements change.

In Express, turning a synchronous controller into a background job means:

  1. Extract the logic into a service
  2. Install and configure BullMQ (or equivalent)
  3. Write a queue definition
  4. Write a worker file
  5. Add retry configuration
  6. Update observability to cover the new execution environment

In Motia, turning a synchronous HTTP Step into a background job means:

  • Change type: 'api' to type: 'event' in the config

The business logic — the handler function — doesn't change at all. The same function that handled an HTTP request can handle a queue message. This portability is what developers mean when they say Motia "makes backend boring" — the framework's job is to eliminate the infrastructure decisions that shouldn't require developer attention.
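Concretely, the move can be sketched as two configs sharing one handler. The field names follow the examples above; the exact shapes are illustrative, and note that the trigger wiring (path vs. subscribes) changes along with the type field:

```typescript
// Before: a synchronous HTTP Step.
export const apiConfig = {
  type: 'api',
  method: 'POST',
  path: '/reports',
  emits: ['report.generated'],
}

// After: the same work as a background job — type changes, the
// HTTP-only fields are dropped, and a subscription replaces the route.
export const eventConfig = {
  type: 'event',
  subscribes: ['report.requested'],
  emits: ['report.generated'],
}

// The handler is shared verbatim between both execution models.
export async function handler(
  input: { reportId: string },
  { emit }: { emit: (topic: string, data: unknown) => Promise<void> }
) {
  await emit('report.generated', { reportId: input.reportId })
}
```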


Event-Driven API Design: When Motia's Model Wins

User onboarding flows — Registration triggers welcome email, workspace creation, CRM sync, trial activation, and analytics event simultaneously. Fan-out at the HTTP boundary keeps the response instant.

Webhook processing at scale — High-volume platforms (Stripe, GitHub, Twilio) send webhooks in bursts. HTTP acknowledgment is immediate; event processing queues and scales independently.

AI/LLM API backends — Streaming token responses through a structured pipeline, with intermediate Steps for context retrieval, prompt construction, and output formatting, is the natural fit for Motia's stream type.

Multi-step document processing — Upload → validate → extract → enrich → store → notify. Each Step is independent, retryable, and testable in isolation.

Payment and fulfillment workflows — The Saga pattern maps directly to Motia's event chains. Compensating actions (refunds, cancellations) are just Steps that subscribe to failure events.

When Traditional REST Frameworks Win

Motia isn't the right tool for every API:

  • Simple CRUD APIs — If your API is thin reads and writes with no background processing, Express or Hono with a single database connection is lighter and simpler
  • Teams deep in NestJS — The decorator-based module system has real advantages for large teams enforcing architectural standards; Motia's config-based approach is less structured
  • Existing BullMQ/Temporal investment — If you've built sophisticated workflow orchestration on Temporal, migrating to Motia's event system requires deliberate planning
  • GraphQL subscription servers — Motia doesn't have native GraphQL support yet; it's HTTP/event-centric

The Observability Dividend

API design decisions have downstream consequences for debugging. One of Motia's under-appreciated advantages is that the visual Workbench isn't an add-on — it's generated from the same Step configs that define your flows.

When a Stripe webhook payment fails in production, the debugging workflow is:

  1. Open Workbench at localhost:3111 (or your deployed Workbench URL)
  2. Find the failed trace in the trace viewer
  3. See exactly which Step failed, with the input payload at that point
  4. Inject the same event through the UI to reproduce locally
  5. Fix the Step, watch the Workbench graph update, re-run

This trace-first debugging model — where every API call is a path through a visible flow graph — is qualitatively different from reading distributed logs across three different services. The flow is the documentation. The documentation is the debugger.


Getting Started: An API Route in Motia vs Express

Here's the same "create user" route in both frameworks, with the full before-and-after for adding email notification as a background job.

Express (adding the background job means two new files, plus controller changes and BullMQ/Redis wiring):

// controllers/users.ts — start here
router.post('/users', async (req, res) => {
  const user = await db.users.create(req.body)
  await emailQueue.add('welcome', { user })    // add this after wiring BullMQ
  res.status(201).json(user)
})

// queues/email.ts — new file
import { Queue } from 'bullmq'
export const emailQueue = new Queue('email', { connection: redisConfig })

// workers/email.worker.ts — new file
import { Worker } from 'bullmq'
const worker = new Worker('email', async (job) => {
  if (job.name === 'welcome') await sendWelcomeEmail(job.data.user)
}, { connection: redisConfig })

Motia (add background job = add 1 file, change 1 line):

// src/create-user.step.ts — add emits field
export const config: ApiRoute = {
  type: 'api',
  method: 'POST',
  path: '/users',
  emits: ['user.created'],    // add this line
}

export async function handler(req, { emit }) {
  const user = await db.users.create(req.body)
  await emit('user.created', user)    // add this line
  return { status: 201, body: user }
}

// src/send-welcome-email.step.ts — new file, that's all
export const config: EventConfig = {
  type: 'event',
  subscribes: ['user.created'],
}

export async function handler(user, { logger }) {
  await sendWelcomeEmail(user)
  logger.info('Welcome email sent', { userId: user.id })
}

One config field added. One new file. No queue definitions. No worker lifecycle management. The runtime handles everything else.


Related Reading

See how Motia compares to dedicated workflow tools: Inngest vs Temporal vs Trigger.dev

Choosing an API framework for the HTTP layer Motia sits on top of: Hono vs Fastify vs Express in 2026

How event-driven API design patterns apply beyond Motia: Event-Driven APIs: Async Patterns 2026

