A better streams API is possible for JavaScript
Network protocols vs stream abstractions
- One early tangent argues that the real problem is trying to treat everything as TCP-like byte streams instead of exposing UDP and more suitable low-level primitives.
- Others counter that TCP/UDP are orthogonal: the Web Streams API is a general abstraction over any byte-producing source (files, audio, network, etc.), and browsers already expose UDP-like capabilities through WebRTC data channels (though not raw UDP).
Performance, BYOB, and buffer management
- BYOB (“bring your own buffer”) reads are widely seen as powerful but overly complex; they significantly reduce GC pressure and copies for large transfers but are hard to use correctly.
- Some commenters suggest simpler reuse schemes (e.g., `stream.returnChunk(chunk)`) or linear/affine types to enforce consumption and reuse, but note that mainstream JS can’t express these guarantees.
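To make the BYOB trade-off concrete, here is a minimal read-loop sketch (assuming Node 18+ or a modern browser; the source, chunk sizes, and `readAll` name are illustrative, not from the article): the consumer supplies the buffer, so each read fills that buffer instead of allocating a fresh `Uint8Array`, but the caller must track buffer-ownership transfer by hand, which is where the complexity lives.

```typescript
// Minimal BYOB ("bring your own buffer") sketch: a byte source serving
// 1 KiB in 256-byte chunks, consumed with a single reused buffer.
async function readAll(): Promise<number> {
  let remaining = 1024;
  const stream = new ReadableStream({
    type: 'bytes', // required to allow a BYOB reader
    pull(controller) {
      if (remaining === 0) {
        controller.close();
        return;
      }
      const chunk = new Uint8Array(256).fill(0xab);
      remaining -= chunk.length;
      controller.enqueue(chunk); // copied into the pending BYOB buffer
    },
  });

  const reader = stream.getReader({ mode: 'byob' });
  let buffer: ArrayBufferLike = new ArrayBuffer(256); // reused across reads
  let total = 0;
  for (;;) {
    // Ownership of `buffer` transfers to the stream during the read;
    // we reclaim the (detached-and-returned) buffer via result.value.buffer.
    const result = await reader.read(new Uint8Array(buffer));
    if (result.done) break;
    total += result.value.byteLength;
    buffer = result.value.buffer; // hand the same memory to the next read
  }
  return total;
}

readAll().then((total) => console.log(total)); // logs 1024
```

The manual `buffer = result.value.buffer` dance is exactly the kind of easy-to-get-wrong bookkeeping the thread complains about.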
Alternative stream API designs
- A major subthread centers on a proposed `Stream<T>` where `next()` can return either `{done, value: T}` or a `Promise` of that, allowing sync where possible and async only when needed.
- Proponents say this unifies sync/async, avoids writing everything twice, and enables “async-batched” behavior.
- Critics argue this is a leaky, hard-to-reason-about abstraction (violates uniform async semantics), and that the primitive should stay “async iterator of Uint8Array” with higher-level abstractions layered on top.
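The sync-when-possible shape under debate can be sketched as follows (the interface and names are illustrative reconstructions of the discussion, not a spec): `next()` returns a plain result when data is already buffered and a `Promise` only when it must wait, and the consumer takes a fast path for the sync case to avoid a microtask per item.

```typescript
// Sketch of the proposed maybe-async pull interface.
type StreamResult<T> = { done: boolean; value?: T };

interface Stream<T> {
  // Sync result when data is ready; Promise only when it must wait.
  next(): StreamResult<T> | Promise<StreamResult<T>>;
}

// The consumer must handle both cases — this is the "leaky" part critics
// dislike — but the sync fast path skips microtask scheduling entirely.
async function sum(stream: Stream<number>): Promise<number> {
  let total = 0;
  for (;;) {
    let r = stream.next();
    if (r instanceof Promise) r = await r; // slow path: genuinely async
    if (r.done) break;
    total += r.value!;
  }
  return total;
}

// Example source: three synchronously buffered values, then one async value.
function demoStream(): Stream<number> {
  const buffered = [1, 2, 3];
  let asyncDelivered = false;
  return {
    next() {
      if (buffered.length > 0) return { done: false, value: buffered.shift() };
      if (!asyncDelivered) {
        asyncDelivered = true;
        return Promise.resolve({ done: false, value: 4 });
      }
      return { done: true };
    },
  };
}
```

Note that plain `for await` would also consume this shape correctly (awaiting a non-Promise is legal), but it forces a microtask tick per item, which is precisely what the explicit `instanceof` check avoids.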
GC, per-item overhead, and await cost
- There’s heated debate over per-byte object allocation: some say generational GCs make many tiny short-lived objects acceptable; others call per-byte objects “insane” for high-throughput I/O.
- Several note that the main cost is often `await`/microtask scheduling, not the Promise object itself; microbenchmarks suggest large slowdowns when very fine-grained async is used unless data is buffered into larger chunks.
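The scheduling cost is easy to demonstrate with a rough microbenchmark (illustrative, not rigorous — sizes and generators are made up): iterating one item per `await` forces a microtask tick per item, while yielding larger chunks and summing synchronously inside pays that cost once per chunk.

```typescript
// Per-item async iteration vs. chunked iteration with a sync inner loop.
async function* perByte(n: number) {
  for (let i = 0; i < n; i++) yield 1; // one await per item downstream
}

async function* chunked(n: number, chunkSize: number) {
  for (let i = 0; i < n; i += chunkSize) {
    yield new Uint8Array(Math.min(chunkSize, n - i)).fill(1);
  }
}

async function time(fn: () => Promise<void>): Promise<number> {
  const t0 = performance.now();
  await fn();
  return performance.now() - t0;
}

async function main() {
  const N = 200_000;
  let fineSum = 0;
  const tFine = await time(async () => {
    for await (const b of perByte(N)) fineSum += b;
  });
  let chunkSum = 0;
  const tChunked = await time(async () => {
    for await (const chunk of chunked(N, 65536)) {
      for (const b of chunk) chunkSum += b; // sync inner, async outer
    }
  });
  return { tFine, tChunked, fineSum, chunkSum };
}

main().then(({ tFine, tChunked }) =>
  // The chunked version typically wins by orders of magnitude,
  // since it awaits a handful of times instead of 200,000.
  console.log({ tFine, tChunked })
);
```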
Critique of article style, AI use, and benchmarks
- Multiple comments complain the prose “sounds like LLM output” and associate that style with low-effort content, especially after previous Cloudflare AI incidents.
- The author acknowledges using an AI assistant for some wording but claims the ideas are his and the result was proofread.
- Some find the article clear and useful; others question technical rigor, especially benchmarks claiming throughput far above the hardware’s memory bandwidth, suggesting “vibecoded” measurements.
Current Web Streams pain points & ecosystem context
- Many describe Web Streams (especially in Node) as awkward: too much hidden Promise creation, confusing backpressure, and surprising behaviors like `ReadableStream.tee()` slowing to the slowest branch in non-intuitive ways.
- There are calls for simpler, Go-like `read(buffer)`/`write` interfaces for raw byte I/O, possibly alongside a richer “value stream” abstraction.
- Several point to prior art: Node’s original `.pipe()`, Deno’s earlier Go-inspired APIs, Observables, pull-stream, transducers, Kotlin Flows, .NET `IAsyncEnumerable`, Okio, and libraries like Repeater or Effect as evidence that better ergonomics and unification are possible.
- Some see this proposal as a step toward a unified, async-aware, pull-based abstraction that could avoid the split seen in other ecosystems between synchronous streams and separate reactive APIs.
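The Go-like shape commenters ask for can be sketched like this (the `ByteReader` interface, `fromBytes` helper, and EOF convention are all hypothetical, loosely mirroring Go's `io.Reader`): `read` fills a caller-supplied buffer and reports how many bytes were written, so a whole transfer reuses one allocation.

```typescript
// Sketch of a Go-style byte reader: fill the caller's buffer, report count.
interface ByteReader {
  // Resolves to the number of bytes written into `buf`, or null at EOF
  // (Go uses an io.EOF error; null plays the same role here).
  read(buf: Uint8Array): Promise<number | null>;
}

// Wrap an in-memory byte array as a ByteReader, for demonstration.
function fromBytes(data: Uint8Array): ByteReader {
  let offset = 0;
  return {
    async read(buf) {
      if (offset >= data.length) return null;
      const n = Math.min(buf.length, data.length - offset);
      buf.set(data.subarray(offset, offset + n));
      offset += n;
      return n;
    },
  };
}

// The consumer reuses a single buffer for the entire transfer.
async function countBytes(r: ByteReader): Promise<number> {
  const buf = new Uint8Array(8);
  let total = 0;
  for (;;) {
    const n = await r.read(buf);
    if (n === null) break;
    total += n;
  }
  return total;
}
```

Compared with the BYOB reader, there is no buffer-ownership transfer to track: the buffer is simply written into, which is the ergonomic simplification the thread is asking for.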