Asynchronous Processing

Asynchronous processing is a cornerstone of Happen's design, allowing the framework to handle concurrent events efficiently while maintaining its commitment to radical simplicity. This document explains how Happen approaches asynchronous processing through multiple strategies: concurrent event handling, direct runtime capabilities, function-based flows, and streaming results.

Core Approach

In Happen, multiple events arriving at a node are processed concurrently without blocking each other, leveraging JavaScript's asynchronous capabilities while preserving event isolation. This happens as an implementation detail rather than an explicit API feature.

When events arrive at a node:

  1. Each event gets its own independent execution context

  2. Events are processed concurrently using JavaScript's natural Promise mechanism

  3. One event processing never blocks another, even within the same node

This approach ensures that slow-running event handlers don't create bottlenecks for unrelated events.
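In plain JavaScript terms, the dispatch model can be sketched as follows. The `dispatch` helper here is illustrative, not Happen's actual internals; it simply shows how starting every handler immediately keeps a slow handler from delaying the others:

```javascript
// Illustrative sketch: each incoming event is handled in its own
// promise, so a slow handler never blocks its siblings.
function dispatch(handler, events) {
  // Start every handler immediately; none waits for the others.
  return Promise.all(events.map((event) => handler(event, {})));
}

const delays = { a: 30, b: 5, c: 15 };
const finished = [];

const results = await dispatch(async (event) => {
  await new Promise((r) => setTimeout(r, delays[event.type]));
  finished.push(event.type);
  return event.type;
}, [{ type: "a" }, { type: "b" }, { type: "c" }]);

// `results` preserves submission order, while `finished` reflects
// completion order -- proof that the handlers overlapped in time.
```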

The Functional Asynchronous Model

Happen embraces JavaScript's native asynchronous capabilities while providing a clean, functional interface through the Event Continuum model:

// Register an asynchronous handler
orderNode.on("process-order", async (event, context) => {
  // Perform asynchronous validation
  const validationResult = await validateOrderData(event.payload);
  
  // Store in context
  context.validation = validationResult;
  
  if (!validationResult.valid) {
    return {
      success: false,
      reason: "validation-failed",
      errors: validationResult.errors
    };
  }
  
  // Return the next function to execute
  return processPayment;
});

// Asynchronous function in the flow
async function processPayment(event, context) {
  // Process payment asynchronously
  const paymentResult = await processTransaction(event.payload.payment);
  
  // Store in context
  context.payment = paymentResult;
  
  if (!paymentResult.success) {
    return handlePaymentFailure;
  }
  
  // Return next function
  return createShipment;
}

Happen's event execution engine handles the async nature transparently:

  1. When a function returns a Promise, Happen awaits it automatically

  2. Async functions are fully supported throughout the flow

  3. Error handling works seamlessly with async/await
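The three rules above can be pictured as a tiny execution loop. This is a simplification for illustration, not Happen's real engine, which also manages context isolation; the `charge` step and payload shape are invented for the example:

```javascript
// Minimal Event Continuum runner: call the handler, await any Promise,
// and keep going as long as the result is another function.
async function runFlow(handler, event, context = {}) {
  let result = await handler(event, context); // awaits sync and async handlers alike
  while (typeof result === "function") {
    result = await result(event, context); // each step may itself be async
  }
  return result; // a non-function value ends the flow
}

async function charge(event, ctx) {
  ctx.charged = event.payload.amount;
  return { success: true, charged: ctx.charged };
}

// Toy flow: validate, then continue to `charge` by returning it.
const outcome = await runFlow(
  async (event, ctx) => {
    ctx.valid = event.payload.amount > 0;
    return ctx.valid ? charge : { success: false, reason: "invalid-amount" };
  },
  { payload: { amount: 42 } }
);
```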

Automatic Queue Management

For high-volume scenarios, Happen includes internal queue management to prevent overwhelming the system.
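The exact mechanism is internal to Happen, but the idea can be sketched with a small concurrency limiter (the limit of 2 and the task timings below are arbitrary):

```javascript
// Simple queue that never runs more than `limit` tasks at once.
function createQueue(limit) {
  let active = 0;
  const waiting = [];
  const next = () => {
    if (active >= limit || waiting.length === 0) return;
    active++;
    const { task, resolve, reject } = waiting.shift();
    task().then(resolve, reject).finally(() => {
      active--;
      next(); // backpressure: the next task starts only when a slot frees up
    });
  };
  return (task) =>
    new Promise((resolve, reject) => {
      waiting.push({ task, resolve, reject });
      next();
    });
}

const enqueue = createQueue(2);
let running = 0;
let peak = 0;

await Promise.all(
  [10, 20, 30, 40].map((ms) =>
    enqueue(async () => {
      running++;
      peak = Math.max(peak, running); // record the highest observed concurrency
      await new Promise((r) => setTimeout(r, ms));
      running--;
    })
  )
);
```

Even with four tasks submitted at once, `peak` never exceeds the configured limit.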

This queue management:

  1. Limits maximum concurrent event handling to prevent resource exhaustion

  2. Preserves ordered processing within each event type

  3. Provides backpressure for high-volume scenarios

  4. Manages memory usage by controlling queue size

Node Configuration

Happen manages concurrency through the system configuration. This approach keeps infrastructure concerns separated from domain logic and provides consistent behavior across your application.

System-Level Concurrency Configuration

Concurrency settings are defined during framework initialization.
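For illustration, such a configuration might look like the following. The `createHappen` entry point and every option name here are assumptions for the sketch, not Happen's documented API:

```javascript
// Hypothetical initialization; option names are illustrative.
const happen = createHappen({
  concurrency: {
    maxConcurrentEvents: 100, // system-wide ceiling across all nodes
    maxQueueSize: 10_000,     // events buffered before backpressure kicks in
  },
  nodes: {
    // Per-node overrides win over the system-wide defaults above.
    "payment-node": { maxConcurrentEvents: 10 },
  },
});
```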

This hierarchical approach ensures consistent behavior while allowing for customization where needed.

Generator-Based Streaming

Building on the foundation of asynchronous processing, Happen supports generator-based streaming for incremental data processing.

Instead of introducing separate API methods for streaming, Happen detects generator functions automatically.
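A plausible detection mechanism (a sketch, not Happen's actual source) distinguishes generator handlers by their runtime type:

```javascript
// Generator and async-generator functions are distinguishable at runtime,
// so no separate registration API is needed for streaming handlers.
function isStreamingHandler(fn) {
  const kind = Object.prototype.toString.call(fn);
  return kind === "[object GeneratorFunction]" ||
         kind === "[object AsyncGeneratorFunction]";
}

function plainHandler(event) { return event; }
function* chunkHandler(event) { yield event; }
async function* asyncChunkHandler(event) { yield event; }
```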

When interacting with a generator-based handler, Happen automatically provides an AsyncIterator interface.
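In plain JavaScript terms, consuming such a handler looks like iterating any async generator with `for await...of`. The `processReport` handler below stands in for one registered on a node:

```javascript
// A streaming handler yields partial results instead of returning one value.
async function* processReport(event) {
  for (const section of event.payload.sections) {
    // Each yield is delivered to the caller as soon as it is produced.
    yield { section, done: false };
  }
  yield { done: true };
}

// The caller consumes results incrementally.
const received = [];
for await (const chunk of processReport({ payload: { sections: ["intro", "body"] } })) {
  received.push(chunk);
}
```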

Generator functions also work with broadcasting, enabling publish-subscribe patterns for streams.
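One way to picture this pattern (a sketch of the idea, not Happen's internals) is to drain the generator once and fan each yielded value out to every subscriber:

```javascript
// Drain a stream once, pushing every value to all subscribers.
async function broadcast(stream, subscribers) {
  for await (const value of stream) {
    for (const deliver of subscribers) deliver(value);
  }
}

async function* ticker() {
  yield "tick-1";
  yield "tick-2";
}

const seenByA = [];
const seenByB = [];
await broadcast(ticker(), [
  (v) => seenByA.push(v),
  (v) => seenByB.push(v),
]);
```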

Use Cases for Streaming

Generator-based streaming in Happen addresses several key use cases:

1. Large Dataset Processing

Break down large datasets into manageable chunks.
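For example, a streaming handler might yield fixed-size chunks, so no consumer ever holds the whole dataset at once (the chunk size and data are illustrative):

```javascript
// Yield an array in fixed-size chunks for incremental processing.
function* inChunks(items, size) {
  for (let i = 0; i < items.length; i += size) {
    yield items.slice(i, i + size);
  }
}

const chunks = [...inChunks([1, 2, 3, 4, 5], 2)];
```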

2. Real-time Progress Updates

Provide incremental feedback for long-running operations.
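A sketch of the pattern: the handler yields a progress update after each unit of work, then a final completion event (the record-import scenario is invented for the example):

```javascript
// A long-running task reports progress by yielding status updates
// between steps, then yields the final result.
async function* importRecords(records) {
  let done = 0;
  for (const record of records) {
    await Promise.resolve(record); // stand-in for real per-record work
    done++;
    yield { type: "progress", percent: Math.round((done / records.length) * 100) };
  }
  yield { type: "complete", imported: done };
}

const updates = [];
for await (const update of importRecords(["a", "b"])) {
  updates.push(update);
}
```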

3. Continuous Data Streams

Handle data from sources that produce continuous updates.
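One sketch of this: wrap a push-based source in an async generator so values are buffered until the consumer asks for them, which gives natural backpressure. The `onValue` callback and the `null` end-of-stream marker are conventions invented for this example:

```javascript
// Adapt a push-based source into an async generator.
function toStream(source) {
  const buffer = [];
  let notify = null;
  source.onValue = (v) => {
    buffer.push(v);
    if (notify) { notify(); notify = null; }
  };
  return (async function* () {
    while (true) {
      while (buffer.length > 0) {
        const v = buffer.shift();
        if (v === null) return; // null marks end-of-stream in this sketch
        yield v;
      }
      // Sleep until the source pushes something new.
      await new Promise((r) => { notify = r; });
    }
  })();
}

const source = {};
const stream = toStream(source);
const readings = [];
const consuming = (async () => {
  for await (const v of stream) readings.push(v);
})();

source.onValue(21);
source.onValue(22);
source.onValue(null); // close the stream
await consuming;
```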

4. Paginated API Results

Fetch and process paginated data from external APIs.
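A streaming handler can hide pagination entirely behind a generator. Here `fetchPage` is a stub standing in for a real HTTP client; consumers just see a flat stream of items:

```javascript
// Stub for a paginated API call; a real client would do an HTTP request.
async function fetchPage(page) {
  const pages = {
    1: { items: ["a", "b"], nextPage: 2 },
    2: { items: ["c"], nextPage: null },
  };
  return pages[page];
}

// Walk the API page by page, yielding items as they arrive.
async function* allItems() {
  let page = 1;
  while (page !== null) {
    const result = await fetchPage(page);
    yield* result.items; // consumers see items, not pages
    page = result.nextPage;
  }
}

const items = [];
for await (const item of allItems()) items.push(item);
```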

Benefits of Happen's Async Processing Model

The combination of concurrent event processing and generator-based streaming delivers several key benefits:

  1. Maximized Throughput: Multiple events process concurrently without blocking

  2. Resource Efficiency: Memory usage is controlled through incremental processing

  3. Responsive System: Long-running operations don't block other processing

  4. Natural Backpressure: Consumers process stream results at their own pace

  5. Minimal API Surface: Power and flexibility without additional primitives

  6. Clean Isolation: Each processing path maintains its own context and error boundaries

Choosing the Right Concurrency Approach

With multiple approaches to concurrency available in Happen, here's a quick guide to choosing the right one for different scenarios:

Approach                  | Best For                 | When To Use
--------------------------|--------------------------|--------------------------------------------------
Functional Flows          | Most use cases           | Default approach for all event handling
Async/Await               | Asynchronous operations  | When operations need to wait for external results
Generator-based Streaming | Incremental processing   | For progress updates and chunked processing
Worker Threads            | CPU-intensive tasks      | When tasks would block the main thread
Distributed Processing    | Horizontal scaling       | When single machine capacity is exceeded

Key Takeaways

  1. The Event Continuum model naturally supports asynchronous processing through JavaScript's native async/await

  2. Runtime transparency allows direct use of platform-specific concurrency features

  3. Generator-based streaming provides powerful incremental processing with minimal API additions

  4. Internal queue management ensures system stability under high load

This multi-faceted approach delivers flexible asynchronous capabilities by providing multiple concurrency strategies that compose seamlessly.
