A quick intro to the main concepts of Node.js
The Event Loop does a simple job - it orchestrates the work between the Call Stack and the Event Queue until the Event Queue is empty.
function foo() {
  setTimeout(() => { // this callback is handed to the timer API, not run right away
    console.log('process after 2 sec.'); // when the timer fires after 2 seconds, the callback is placed into the Event Queue.
    // Exactly at this moment the Event Loop has something important to do.
    // The Event Loop's job is very simple - it monitors the Call Stack
    // and the Event Queue. When the Call Stack is empty and the Event Queue is not,
    // it de-queues the event from the Event Queue and places it on the Call Stack.
    // The Call Stack invokes the associated callback and pops it
    // off when it returns. The Call Stack and Event Queue are both empty now,
    // so the Event Loop doesn't need to process anymore. All Node APIs work with this
    // concept.
  }, 2000);
}

foo();
With an Event Emitter you can emit events, e.g. logger.emit('error:event'), and register listener functions with logger.on('error:event', listenerFn).
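A minimal sketch of that pattern (the Logger class and the 'something failed' payload are just illustrative):

const EventEmitter = require('events');

class Logger extends EventEmitter {}
const logger = new Logger();

// register the listener first, then emit the event with a payload
logger.on('error:event', (err) => console.error('handled:', err.message));
logger.emit('error:event', new Error('something failed'));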
Event Emitter vs. Callback?
Run node debug index.js - the debugger will listen on 127.0.0.1:5858.
help - to see available commands for the debugger.
cont - to continue debugging.
restart - to restart debugging.
sb(2 /*line number*/) - to put a breakpoint where the debugger will stop.
repl - when a breakpoint is activated, to inspect anything accessible to the script at that point. E.g. type the name of a function argument or a variable, and the repl will output its value.
watch(arg /*variable name to watch*/) - to watch the value of a variable without breaking every time. It is handy when debugging loops.
Run node --inspect --debug-brk index.js - it will output a URL; copy this URL and paste it into your browser and you have a fully functional debugger with a lot of features.
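For instance, here is a tiny hypothetical index.js to try those commands on; sb(5) and watch('total') below refer to this sketch, not to any real project file:

// index.js
function sum(n) {
  let total = 0;
  for (let i = 1; i <= n; i += 1) {
    total += i; // sb(5) sets a breakpoint here; watch('total') shows the value on each iteration
  }
  return total;
}

console.log(sum(10));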
Working with big amounts of data in Node.js means working with Streams.
Streams in Node.js give you the power of composability in your code: just like you can compose powerful Linux commands by piping smaller commands, ex: $ git grep require | grep -v // | wc -l, you can do the same in Node.js with Streams.
Streams are simply collections of data that might not be available all at once and don't have to fit in memory.
There are four fundamental types of Streams in Node.js: Readable, Writable, Duplex and Transform.
A Readable stream is an abstraction for a source from which data can be consumed.
An example of that is the fs.createReadStream function of the fs module.
Readable Stream events:
data - emitted whenever the stream passes a chunk of data to the consumer.
end - emitted when there is no more data to be consumed from the stream.
error, close, readable.
Readable Stream functions:
pipe(), unpipe(), read(), unshift(), resume(), pause(), isPaused(), setEncoding().
Readable Streams can be either in paused mode or in flowing mode. These are sometimes referred to as pull vs. push modes.
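A minimal sketch of consuming a Readable stream through its events (./big.file is a hypothetical path, assumed to exist):

const fs = require('fs');

const readable = fs.createReadStream('./big.file');

readable.on('data', (chunk) => {
  console.log(`received ${chunk.length} bytes`); // data - a chunk arrived
});

readable.on('end', () => {
  console.log('no more data'); // end - the source is exhausted
});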
A Writable stream is an abstraction for a destination to which data can be written.
An example of that is the fs.createWriteStream function of the fs module.
Writable Stream events:
drain - a signal that the writable stream can receive more data.
finish - emitted when all the data has been flushed to the underlying system.
error, close, pipe, unpipe.
Writable Stream functions:
write(), end(), cork(), uncork(), setDefaultEncoding().
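A quick sketch of the writing side (./out.txt is just an example destination path):

const fs = require('fs');

const writable = fs.createWriteStream('./out.txt');

writable.write('hello ');
writable.write('world');
writable.end('\n'); // end() accepts an optional final chunk to write before closing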
Duplex streams are both Readable and Writable, like a socket for example.
Transform streams are basically Duplex streams that can be used to modify or transform the data as it is written and read.
An example of that is the zlib createGzip stream to compress the data using gzip.
You can think of a Transform stream as a function where the input is the Writable stream part and the output is the Readable stream part.
All Streams are instances of EventEmitter. They all emit events that we can use to write or read data from them. However, we can consume streams in a simpler way using the pipe method, ex: src.pipe(dst) /* src - readable stream, dst - writable stream */.
Linux: a | b | c | d
Node: a.pipe(b).pipe(c).pipe(d);
or a.pipe(b); b.pipe(c); c.pipe(d);
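For example, piping a file through the zlib transform mentioned above (./big.file is a hypothetical input file):

const fs = require('fs');
const zlib = require('zlib');

fs.createReadStream('./big.file')
  .pipe(zlib.createGzip()) // Transform: compresses the data flowing through
  .pipe(fs.createWriteStream('./big.file.gz'));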
Stream implementers are usually the ones who use the stream module.
For consuming, all we have to do is either use pipe or listen to stream events:
const { Writable } = require('stream');

// it will echo whatever you type to the console.
const echoStream = new Writable({
  write(chunk, encoding, callback) { // write() is the required option to implement a writable stream.
    console.log(chunk.toString());
    callback();
  }
});
To consume the above stream, we can simply use process.stdin.pipe(echoStream);
echoStream is not really useful; the same echo functionality can be implemented using process.stdout:
process.stdin.pipe(process.stdout)
Now it will do the same: echo whatever you type to the console.
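Building on the echo example, here is a sketch of a Transform implementer (the upperCaseStream name and the upper-casing behavior are purely illustrative):

const { Transform } = require('stream');

const upperCaseStream = new Transform({
  transform(chunk, encoding, callback) { // required option to implement a transform stream.
    this.push(chunk.toString().toUpperCase());
    callback();
  }
});

// echoes whatever you type, upper-cased
process.stdin.pipe(upperCaseStream).pipe(process.stdout);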