My understanding of pipes is that all piped commands are started in parallel, and the stdout of each command is fed to the next command as its stdin. When processing a large file, the initial parts of the data may already have finished processing while later parts are still in the earlier stages of the pipeline. Is this a correct picture of what happens?
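For example, this is the kind of behaviour I have in mind (just a made-up illustration, not a real workload):

```sh
# Illustrative only: even though the first command never finishes
# ('yes' produces an endless stream), head still prints five lines and
# the pipeline exits. That only makes sense to me if the commands run
# in parallel and data flows through the pipe as it is produced.
yes "some line of data" | head -n 5
```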
Then, what happens when the pipeline contains a command (e.g. sort) that needs all of its input before it can produce output, rather than working line by line? Will it process the data in small chunks and pass them forward, or will it wait until the previous command has finished sending all of its data? If it does wait, how is the waiting data handled? Is it stored in RAM? Does a pipe have an upper limit on the amount of data it can hold?
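To make that part concrete, this is roughly the situation I mean (again, only a sketch with an artificially slow producer):

```sh
# Illustrative sketch: a slow producer feeding sort. Nothing appears on
# the terminal until the loop has finished, presumably because sort has
# to see all of its input before it can emit the first (smallest) line.
for i in 5 3 4 1 2; do
    echo "$i"
    sleep 1          # simulate a slow upstream command
done | sort -n
```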