Solution: Asynchronous Streaming Adapter
Explore how to design an asynchronous streaming adapter that converts a legacy callback-driven file reader into a modern async iterator. Understand how to manage event-driven data flow with queues and promises to synchronize push and pull timing models, enabling clean async iteration in Node.js. This lesson equips you to integrate legacy streaming APIs with modern async code using the adapter pattern.
Solution explanation
Lines 4–15: We define a `legacyReader` that reads data from a real file using Node's `fs` module:

- It opens a readable stream with `fs.createReadStream(filename, { encoding: 'utf-8' })`.
- On each `'data'` event, it passes the received chunk to the callback.
- When the stream ends, it signals completion by calling the callback with `null`.
- On errors, it logs the issue and also signals completion.

This mirrors how many legacy streaming APIs work: callback-driven and event-based (see the sketch below).
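For reference, here is a minimal sketch of what such a callback-driven reader could look like. The `legacyReader(filename, callback)` signature and the exact error-handling details are assumptions based on the description above, not the lesson's exact code.

```javascript
const fs = require('fs');

// Assumed shape of the legacy reader: push-based and callback-driven.
function legacyReader(filename, callback) {
  const stream = fs.createReadStream(filename, { encoding: 'utf-8' });

  // Push model: every chunk is handed to the callback as soon as it arrives.
  stream.on('data', (chunk) => callback(chunk));

  // End of stream is signaled by passing null to the callback.
  stream.on('end', () => callback(null));

  // On error, log the problem and still signal completion so the consumer is not left waiting.
  stream.on('error', (err) => {
    console.error('Read failed:', err.message);
    callback(null);
  });
}
```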
Lines 17–40: The `StreamAdapter` converts that push model into a pull-based async iterator. This design lets the adapter synchronize two mismatched timing models (a sketch follows the list below). The constructor initializes three critical internal properties:

- `queue`: a buffer that temporarily stores chunks that have already arrived but haven't yet been requested by the consumer.
- `resolveNext`: a reference to a promise resolver function. This is used when the consumer has called `next()` but no new data has arrived yet, essentially "parking" the ...
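To make the push-to-pull conversion concrete, here is a minimal sketch of an adapter along these lines. The `push` method name, the `done` flag standing in for the third constructor property, and the single-pending-`next()` simplification are assumptions for illustration, not the lesson's exact implementation.

```javascript
// A minimal push-to-pull adapter (sketch). Assumes at most one pending next() at a time.
class StreamAdapter {
  constructor() {
    this.queue = [];          // chunks that arrived before the consumer asked for them
    this.resolveNext = null;  // parked resolver for a next() call that is waiting on data
    this.done = false;        // assumed third property: marks the end of the stream
  }

  // The push side: called from the legacy reader's callback.
  push(chunk) {
    if (chunk === null) {
      this.done = true;
    }
    if (this.resolveNext) {
      // A consumer is already waiting: hand over the result immediately.
      const resolve = this.resolveNext;
      this.resolveNext = null;
      resolve(chunk === null
        ? { value: undefined, done: true }
        : { value: chunk, done: false });
    } else if (chunk !== null) {
      // No one is waiting yet: buffer the chunk for a later next() call.
      this.queue.push(chunk);
    }
  }

  // The pull side: makes the adapter usable with for await...of.
  [Symbol.asyncIterator]() {
    return {
      next: () => {
        if (this.queue.length > 0) {
          return Promise.resolve({ value: this.queue.shift(), done: false });
        }
        if (this.done) {
          return Promise.resolve({ value: undefined, done: true });
        }
        // Nothing buffered yet: park the resolver until push() delivers the next chunk.
        return new Promise((resolve) => { this.resolveNext = resolve; });
      },
    };
  }
}
```

With this shape, the legacy callback simply forwards chunks into the adapter, for example `legacyReader('data.txt', (chunk) => adapter.push(chunk))`, and the consumer pulls them with `for await (const chunk of adapter) { ... }` inside an async function. The queue absorbs data that arrives faster than it is consumed, while the parked resolver covers the opposite case, which is exactly the timing mismatch the adapter is meant to bridge.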