Control when data chunk flows through Node stream


EDIT: I solved my own problem in this other post: Extract binary values from stream with low memory consumption



How does one precisely control the flow of a stream in Node.js?



Take this code example from an Express route:



const stream = require('stream');

class ControlStream extends stream.Transform {
  constructor(options) {
    super(options);
  }

  _transform(chunk, enc, callback) {
    this.push(chunk);
    callback();
  }
}

api.route("/stream").post((req, res) => {
  let controlStream = new ControlStream();
  req.pipe(controlStream);
});


In this example I am piping a request stream into a ControlStream instance, which is just a subclass of Transform. The result is that data flows continuously through ControlStream.



I would like the ability to pause this data flow and instead "request" each chunk of data.



For example:



const stream = require('stream');

class ControlStream extends stream.Transform {
  constructor(options) {
    super(options);
  }

  _transform(chunk, enc, callback) {
    this.push(chunk);
    callback();
  }

  getChunk() {
    // requests the next chunk of data to be sent into _transform
  }
}

api.route("/stream").post((req, res) => {
  let controlStream = new ControlStream();
  req.pipe(controlStream);

  controlStream.getChunk();
});


As far as I can see, the default implementation only lets me "listen in" on the flow of data; I can't control when that data flows or, more importantly, how much data will flow.



Thanks for your time.
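
A minimal sketch of how getChunk() might work, assuming a deferred-callback approach (hold the _transform callback and release it on demand, so Node's built-in backpressure pauses the source once buffers reach highWaterMark); this is a sketch under those assumptions, not code from the post:

const stream = require('stream');

class ControlStream extends stream.Transform {
  constructor(options) {
    super(options);
    this._pendingCallback = null; // held until the next chunk is requested
  }

  _transform(chunk, enc, callback) {
    this.push(chunk);
    // Defer the callback instead of calling it immediately; no further
    // chunk enters _transform until getChunk() releases it.
    this._pendingCallback = callback;
  }

  getChunk() {
    if (this._pendingCallback) {
      const cb = this._pendingCallback;
      this._pendingCallback = null;
      cb(); // signal readiness for the next chunk
    }
  }
}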










node.js stream

asked Mar 27 at 23:08 by mrg95, edited Mar 29 at 1:22
  • This would just be pausing req, then resuming it, then pausing it again after a single chunk is delivered, and resuming it again so that the process can repeat with the next chunk. I can see how to do it, but I really want to question the reason it needs to be done.

    – Dan D.
    Mar 27 at 23:14












  • This is my reason why: stackoverflow.com/questions/55365136/… I'm trying to write a binary data parser that doesn't hold all the values in memory, but as far as I know I need to control when a chunk enters the Transform stream in order to implement the logic I want.

    – mrg95
    Mar 27 at 23:16

















2 Answers
https://nodejs.org/api/stream.html#stream_readable_pause



The readable side of the stream does have pause functionality, which is implemented by buffering the data and stopping any 'data' events.
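
A minimal sketch of that pause/resume flow, assuming an Express route as in the question (handleChunk is a hypothetical async consumer):

api.route("/stream").post((req, res) => {
  req.on('data', async (chunk) => {
    req.pause();              // stop 'data' events after this chunk
    await handleChunk(chunk); // hypothetical consumer
    req.resume();             // explicitly request the next chunk
  });
  req.on('end', () => res.sendStatus(200));
});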






– answered Mar 27 at 23:16 by C Deuter
  • I'm aware of .pause(), but how does one allow only a single chunk of data to pass through upon resuming? Also, does this properly backpressure the streams, or does the data just pile up in memory?

    – mrg95
    Mar 27 at 23:17












  • Yes, there is an internal backpressure mechanism in readable streams. You can modify the highWaterMark setting of both readable and writable streams, and you can read a specific number of bytes by using the readable.read method; the number of bytes is specified as the size arg.

    – C Deuter
    Apr 2 at 21:46
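
A minimal sketch of the read(size) suggestion (the 16-byte size is arbitrary, for illustration; read() returns null until that many bytes are buffered, except at stream end):

req.on('readable', () => {
  let bytes;
  while ((bytes = req.read(16)) !== null) { // pull exactly 16 bytes at a time
    console.log('pulled %d bytes', bytes.length);
  }
});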


















Since express.Request is a ReadableStream, what about async iteration (for await)?



// your function
async function process(...args) {
  //.....
}

// optional
function sleep(ms) {
  return new Promise(resolve => setTimeout(resolve, ms));
}

api.route("/stream").post(async (req, res) => {
  for await (const chunk of req) {
    await process(chunk); // do something with the chunk
    await sleep(3000); // you can also sleep here if you need to
  }
});


I don't know how much data would flow at a time, though.
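
A possible variation on the same idea (a sketch, not from the answer): drive the request's async iterator by hand, so each chunk is pulled only when explicitly requested rather than in a loop:

const it = req[Symbol.asyncIterator]();

async function nextChunk() {
  const { value, done } = await it.next(); // pulls exactly one chunk
  return done ? null : value;
}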






– answered Mar 28 at 14:37 by rilut, edited Mar 28 at 14:43
  • Thanks for your assistance! This does not backpressure the stream. The chunk sizes are irregular and go well beyond a single chunk's buffer limit. This also isn't really what I need, since I can't pause the stream (backpressure) and request the next chunk at will. This is basically just listening on the 'data' event.

    – mrg95
    Mar 28 at 19:35












