# streams

node.js has a handy interface for shuffling data around
called streams

---
# reference materials!

https://github.com/substack/stream-handbook
http://nodeschool.io/#stream-adventure

---
# stream origins

```
"We should have some ways of connecting programs like garden
hose--screw in another segment when it becomes necessary to
massage data in another way. This is the way of IO also."
```

[Doug McIlroy. October 11, 1964](
http://cm.bell-labs.com/who/dmr/mdmpipe.html)

---
# why streams?

* we can compose streaming abstractions
* we can operate on data chunk by chunk

---
# composition

Just like how in unix we can pipe commands together:

```
$ <mobydick.txt sed -r 's/\s+/\n/g' | grep -i whale | wc -l
1691
```

we can pipe abstractions together with streams using
`.pipe()`:

``` js
read('mobydick.txt')
    .pipe(replace(/\s+/g, '\n'))
    .pipe(filter(/whale/i))
    .pipe(linecount(function (count) {
        console.log(count)
    }))
;
```
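
The `read()`, `replace()`, `filter()`, and `linecount()` modules
above are imaginary, so here is a minimal runnable sketch of the
same idea using only `fs` and `through2` (a module introduced
later in these slides); for brevity it ignores words that get
split across chunk boundaries:

``` js
var fs = require('fs');
var through = require('through2');

var count = 0;
fs.createReadStream('mobydick.txt')
    .pipe(through(write, end))
;

function write (buf, enc, next) {
    // split this chunk on whitespace and count the whale-y words
    count += buf.toString().split(/\s+/)
        .filter(function (w) { return /whale/i.test(w) })
        .length
    ;
    next();
}

function end (done) {
    console.log(count);
    done();
}
```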
---
# chunk by chunk

With streams, we can operate on data chunk by chunk, without
buffering everything into memory.

This means we can write programs that operate on very large
files!

It also means we can have hundreds or thousands of
concurrent streams without using much memory.
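
To see the chunking in action, here is a small sketch (the file
name is a stand-in for any large file) that logs the size of
each chunk as it arrives instead of buffering the whole file:

``` js
var fs = require('fs');

fs.createReadStream('bigfile.txt')
    .on('data', function (chunk) {
        console.log('chunk: %d bytes', chunk.length);
    })
    .on('end', function () {
        console.log('done');
    })
;
```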
---
# fs

We can read a file and stream the file contents to stdout:

``` js
var fs = require('fs');

fs.createReadStream('greetz.txt')
    .pipe(process.stdout)
;
```
---

```
$ echo beep boop > greetz.txt
$ node greetz.js
beep boop
```

---

now let's transform the data before we print it out!

---
# fs

You can chain `.pipe()` calls together just like the `|`
operator in bash:

``` js
var fs = require('fs');

fs.createReadStream('greetz.txt')
    .pipe(...)
    .pipe(process.stdout)
;
```

---
# fs

``` js
var fs = require('fs');
var through = require('through2');

fs.createReadStream('greetz.txt')
    .pipe(through(toUpper))
    .pipe(process.stdout)
;

function toUpper (buf, enc, next) {
    var up = buf.toString().toUpperCase();

}
```

---
# fs

``` js
var fs = require('fs');
var through = require('through2');

fs.createReadStream('greetz.txt')
    .pipe(through(toUpper))
    .pipe(process.stdout)
;

function toUpper (buf, enc, next) {
    var up = buf.toString().toUpperCase();
    this.push(up);

}
```

---
# fs

``` js
var fs = require('fs');
var through = require('through2');

fs.createReadStream('greetz.txt')
    .pipe(through(toUpper))
    .pipe(process.stdout)
;

function toUpper (buf, enc, next) {
    var up = buf.toString().toUpperCase();
    this.push(up);
    next();
}
```

---
# fs

```
$ node greetz.js
BEEP BOOP
```

---
# stdin

What if we want to read from stdin instead of a file?
Just pipe from `process.stdin` instead of
`fs.createReadStream()`.

---

before:

``` js
var fs = require('fs');
var through = require('through2');

fs.createReadStream('greetz.txt')
    .pipe(through(toUpper))
    .pipe(process.stdout)
;

function toUpper (buf, enc, next) {
    var up = buf.toString().toUpperCase();
    this.push(up);
    next();
}
```

---

after:

``` js
var through = require('through2');

process.stdin
    .pipe(through(toUpper))
    .pipe(process.stdout)
;

function toUpper (buf, enc, next) {
    var up = buf.toString().toUpperCase();
    this.push(up);
    next();
}
```

---
# through2

through2 is a module you can install with npm:

```
$ npm install through2
```

It makes setting up a transform stream less verbose than
using the core methods.

---

a version of our program using core streams:

``` js
var Transform = require('stream').Transform;
var toUpper = new Transform;
toUpper._transform = function (buf, enc, next) {
    var up = buf.toString().toUpperCase();
    this.push(up);
    next();
};

process.stdin
    .pipe(toUpper)
    .pipe(process.stdout)
;
```

---
# through2 vs stream.Transform

rules of thumb:

* use through when you only want to transform some data
* use core Transform when you want to use inheritance (see the
  sketch after this list)
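
A minimal sketch of the inheritance case, in the `util.inherits`
style common at the time (the `Upper` name is made up for
illustration):

``` js
var Transform = require('stream').Transform;
var util = require('util');

util.inherits(Upper, Transform);

function Upper (opts) {
    Transform.call(this, opts);
}

// same uppercasing transform as before, as a subclass
Upper.prototype._transform = function (buf, enc, next) {
    this.push(buf.toString().toUpperCase());
    next();
};

process.stdin.pipe(new Upper).pipe(process.stdout);
```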
---
# through(write, end)

With through there are 2 parameters: `write` and `end`.
Both are optional.

* `function write (buf, enc, next) {}`
* `function end (done) {}`

Call `next()` when you're ready for the next chunk.
If you don't call `next()`, your stream will hang!

Likewise, `end` receives a callback (`done` above): call it
when you're finished flushing, or the stream will never end.

Call `this.push(VALUE)` inside the callback to put VALUE
into the stream's output.

Use a `VALUE` of `null` to end the stream.
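
For instance, here are both parameters together: uppercase each
chunk, then push one last value when the input ends.

``` js
var through = require('through2');

process.stdin
    .pipe(through(write, end))
    .pipe(process.stdout)
;

function write (buf, enc, next) {
    this.push(buf.toString().toUpperCase());
    next(); // ready for the next chunk
}

function end (done) {
    this.push('THE END\n'); // one last value before the stream ends
    done();
}
```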
---
# concat-stream

---
# http

---
# readable

---
# writable

---
# through

```
process.stdin.pipe(process.stdout)
```
