Batch Async Pattern¶
Problem¶
Using Promise.all() with large arrays of async operations can overwhelm databases and API gateways:
// DANGEROUS: 50+ concurrent requests can crash Kong/Supabase
await Promise.all(widgets.map(w => supabase.from('widgets').update(w).eq('id', w.id)))
This caused Kong to crash with 29+ concurrent requests in home-portal.
Solution¶
Use batched concurrency to limit simultaneous requests:
import { batchAsync, BATCH_SIZES } from '@/lib/utils/batch-async'
await batchAsync(
  widgets,
  async (w) => supabase.from('widgets').update(w).eq('id', w.id),
  { batchSize: BATCH_SIZES.SUPABASE }
)
The Utility¶
Located at: src/lib/utils/batch-async.ts
Functions¶
batchAsync(items, fn, options) - Process with batched concurrency
sequential(items, fn) - Process one at a time
Options¶
| Option | Default | Description |
|---|---|---|
| batchSize | 5 | Concurrent operations per batch |
| delayBetweenBatches | 0 | Milliseconds between batches |
| stopOnError | true | Stop processing on first error |
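The real implementation lives in src/lib/utils/batch-async.ts. As a rough sketch of the behaviour these options describe (assumptions only: the error shape and exact stopOnError semantics may differ from the actual source), batched concurrency boils down to slicing the input and awaiting each slice:
// Hypothetical sketch only; see src/lib/utils/batch-async.ts for the real source
interface BatchOptions {
  batchSize?: number
  delayBetweenBatches?: number
  stopOnError?: boolean
}

interface BatchResult<T, R> {
  results: R[]
  errors: { item: T; error: unknown }[] // assumed error shape
  totalProcessed: number
}

async function batchAsyncSketch<T, R>(
  items: T[],
  fn: (item: T) => Promise<R>,
  { batchSize = 5, delayBetweenBatches = 0, stopOnError = true }: BatchOptions = {}
): Promise<BatchResult<T, R>> {
  const results: R[] = []
  const errors: { item: T; error: unknown }[] = []
  let totalProcessed = 0

  for (let i = 0; i < items.length; i += batchSize) {
    const batch = items.slice(i, i + batchSize)
    // One slice runs concurrently; allSettled keeps a single failure from rejecting the slice
    const settled = await Promise.allSettled(batch.map((item) => fn(item)))

    settled.forEach((outcome, idx) => {
      totalProcessed++
      if (outcome.status === 'fulfilled') results.push(outcome.value)
      else errors.push({ item: batch[idx], error: outcome.reason })
    })

    if (stopOnError && errors.length > 0) break

    if (delayBetweenBatches > 0 && i + batchSize < items.length) {
      await new Promise((resolve) => setTimeout(resolve, delayBetweenBatches))
    }
  }

  return { results, errors, totalProcessed }
}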
Recommended Batch Sizes¶
import { BATCH_SIZES } from '@/lib/utils/batch-async'
BATCH_SIZES.SUPABASE // 5 - PostgREST/Kong operations
BATCH_SIZES.EXTERNAL_API // 3 - Third-party APIs
BATCH_SIZES.FILE_SYSTEM // 10 - File operations
BATCH_SIZES.CPU_BOUND // 1 - Sequential only
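The exact definition is in batch-async.ts; a plausible declaration, using only the values from the comments above, is a frozen constant map:
// Hypothetical declaration (values taken from the comments above)
export const BATCH_SIZES = {
  SUPABASE: 5,      // PostgREST/Kong operations
  EXTERNAL_API: 3,  // Third-party APIs
  FILE_SYSTEM: 10,  // File operations
  CPU_BOUND: 1,     // Sequential only
} as const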
When to Use¶
Use batchAsync when:
- Processing arrays of 10+ items with async operations
- Making database updates/inserts in bulk
- Calling external APIs in a loop
- Any Promise.all() with unbounded array size
Use sequential when:
- Order of operations matters
- Each operation depends on the previous (see the sketch below)
- Debugging concurrent issues
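A minimal sketch of sequential in use, where the steps must run strictly in order (the migration names and runMigration helper are hypothetical, not part of the utility):
import { sequential } from '@/lib/utils/batch-async'

// Hypothetical migration runner standing in for project-specific work
async function runMigration(name: string): Promise<void> {
  console.log(`applying ${name}`)
}

// Each step assumes the previous one has fully committed, so they cannot overlap
const migrations = ['001_create_tables', '002_backfill', '003_add_indexes']
await sequential(migrations, (name) => runMigration(name))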
Examples¶
Bulk Database Update¶
// API route handler
import { NextResponse } from 'next/server'
import { batchAsync, BATCH_SIZES } from '@/lib/utils/batch-async'
// supabase client assumed to be imported from the project's client module

export async function PUT(request: Request) {
  const { items } = await request.json()

  await batchAsync(
    items,
    async (item) => {
      await supabase.from('table').update(item).eq('id', item.id)
    },
    { batchSize: BATCH_SIZES.SUPABASE }
  )

  return NextResponse.json({ success: true })
}
With Error Handling¶
const { results, errors, totalProcessed } = await batchAsync(
  items,
  async (item) => processItem(item),
  { stopOnError: false }
)

if (errors.length > 0) {
  console.error(`${errors.length} items failed:`, errors)
}
Rate-Limited API¶
await batchAsync(
  urls,
  async (url) => fetch(url),
  {
    batchSize: BATCH_SIZES.EXTERNAL_API,
    delayBetweenBatches: 100 // 100ms between batches
  }
)
Anti-Patterns¶
// BAD: Unbounded concurrency
await Promise.all(items.map(i => process(i)))
// BAD: No concurrency control in loops
for (const item of items) {
  process(item) // Missing await = concurrent
}
// GOOD: Batched concurrency
await batchAsync(items, process, { batchSize: 5 })
// GOOD: Sequential when needed
await sequential(items, process)
Performance Notes¶
- Batch size of 5 provides good throughput without overwhelming most services
- For 29 items with batchSize=5: 6 batches, ~6 round-trips vs 29 with sequential
- Adjust batch size based on target service capacity
- Monitor for 429 (rate limit) or 503 (overload) errors and reduce batch size (see the sketch below)
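One way to surface those status codes is to throw from the worker and collect failures with stopOnError: false; a hedged sketch (the status checks, URLs, and logging are illustrative, not part of the utility):
import { batchAsync, BATCH_SIZES } from '@/lib/utils/batch-async'

const urls: string[] = ['https://example.com/a', 'https://example.com/b']

const { errors } = await batchAsync(
  urls,
  async (url) => {
    const res = await fetch(url)
    // Treat rate-limit / overload responses as failures so they land in `errors`
    if (res.status === 429 || res.status === 503) {
      throw new Error(`${res.status} from ${url}`)
    }
    return res
  },
  { batchSize: BATCH_SIZES.EXTERNAL_API, stopOnError: false }
)

if (errors.length > 0) {
  // Seeing 429/503 here is the signal to lower batchSize or add delayBetweenBatches
  console.warn(`${errors.length} requests hit rate/overload limits`)
}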