Requiring a module: the steps taken by Node.js
The following is the logic Node.js follows for a require('something') statement:
• If something is a core module, return it.
• If something is a relative path (starts with './' or '../'), return that file OR folder.
• If not, look for node_modules/something in the current folder and in each folder up the directory tree, until you find a file OR folder that matches something.
When matching a file OR folder, follow these steps:
• If it matched a file name, return it.
• If it matched a folder name and it has package.json with main, return that file.
• If it matched a folder name and it has an index file, return it.
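You can watch this resolution happen with require.resolve, which runs the same lookup but returns the resolved path instead of loading the module (the './cool.js' and 'express' names below are just placeholders for a file and a node_modules package you actually have):
require.resolve('fs'); // core module: returns 'fs'
require.resolve('./cool.js'); // relative path: returns the absolute path of the file
require.resolve('express'); // bare name: returns the path found by walking node_modules up the tree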
Inheritance the Node.js way with util.inherits
Typically in JavaScript, for inheritance you would use either the prototype pattern or Object.create(), passing it a prototype to inherit from.
In the prototypal pattern you compose a constructor's prototype object from the inherited classes (constructors), creating the __proto__ chain.
Object.create does the same, but in a cleaner, simpler way.
In Node there's a way to avoid messing around with creating Constructor.prototype instances from other constructors to build the __proto__ chain: it's called the inherits function of the core module util that we discussed previously.
The inherits function takes two constructors as arguments: the first is the inheriting constructor, the second is the parent (super/base class).
By calling the function with the desired parameters, the inheriting constructor's prototype will be set up to inherit from the super class's prototype.
Example:
inherits(Bird, Animal);
Another thing to note:
In order to inherit the super class's own (instance) properties, the inheriting constructor should call the super constructor on this inside its own body. Example:
function Bird() {
  Animal.call(this); // run the Animal constructor with this bound to the new Bird instance
}
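Putting both pieces together, a minimal sketch (the Animal/Bird names and the legs property are just illustrative):
var inherits = require('util').inherits;
function Animal() {
  this.legs = 4; // own property set by the super constructor
}
Animal.prototype.walk = function () {
  return 'walking on ' + this.legs + ' legs';
};
function Bird() {
  Animal.call(this); // inherit the own properties
  this.legs = 2;
}
inherits(Bird, Animal); // Bird.prototype now inherits from Animal.prototype
var bird = new Bird();
bird.walk(); // 'walking on 2 legs'
bird instanceof Animal; // true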
EventEmitter
events.EventEmitter:
example:
var events = require('events');
var emitter = new events.EventEmitter();
var firstCallback = function a(dataArray){/*DO SOMETHING*/}
var secondCallback = function b(dataArray){/*DO SOMETHING ELSE*/}
emitter.on('someEvent',firstCallback );
emitter.on('someEvent',secondCallback );
emitter.listeners('someEvent') //returns [function a(){}, function b(){}]
emitter.emit('someEvent', [1,2,3]);
emitter.removeListener('someEvent',firstCallback );
emitter.once('onStartup', function () {
console.log('this executes once');
});
Combining EventEmitter with Node's inherits function
Inheriting from EventEmitter in your constructor can be useful to expose an events API from your module.
Many modules build their events by inheriting from EventEmitter.
Example:
var inherits = require('util').inherits;
var Emitter = require('events').EventEmitter;
function Animal() {
  Emitter.call(this); // run the EventEmitter constructor on the new instance
}
inherits(Animal, Emitter);
// Animal now fully inherits from EventEmitter
Animal.prototype.startWalking = function (distance) {
  this.emit('walking', distance);
};
var animal = new Animal();
animal.on('walking',function(data){
console.log('animal is walking, and will stop after ' + data + ' meters');
});
animal.startWalking(2000); // prints: animal is walking, and will stop after 2000 meters
Process events you can listen to
The Node process object inherits from EventEmitter, so you can listen to the events it emits.
example:
process.on('uncaughtException', function (err) {
console.log('Caught exception: ', err);
console.log('Stack:', err.stack);
process.exit(1);
});
Some events you can listen to on process
uncaughtException - emitted when an exception bubbles all the way up without being caught
exit - emitted when the process is exiting; no async operations can be done at this point since the process has already been torn down.
SIGINT - emitted when the user presses Ctrl+C when a process is executing in the CLI
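For example, minimal handlers for the exit and SIGINT events (the log messages are just illustrative):
process.on('exit', function (code) {
  // only synchronous work is possible here
  console.log('process exiting with code ' + code);
});
process.on('SIGINT', function () {
  console.log('got Ctrl+C, shutting down');
  process.exit(0);
});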
Streams
Streams are good for transferring large files from one location to another. This is because, unlike a buffer, a stream does not wait until all the data is converted to the desired format; instead it transfers the data in chunks.
This way the side that requested the data can almost immediately start doing something with it, like starting to render an image before the whole image is received, or starting to play a video before the whole video is received.
NOTE! Since streams are event driven, they also inherit from EventEmitter, and have the emit & on methods.
There are 4 main types of streams: Readable, Writable, Duplex (both readable & writable) and Transform (a Duplex that modifies the data passing through it).
These classes can be found under the 'stream' core module (see Important Core Modules).
NOTE! Two noteworthy streams in Node are process.stdin (readable) & process.stdout (writable)
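For example, since process.stdin is readable and process.stdout is writable, a minimal echo sketch looks like this (run it and type something in the terminal):
process.stdin.on('data', function (chunk) {
  process.stdout.write('you typed: ' + chunk);
});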
example for creating streams by passing an options object:
var stream = require('stream');
// a readable stream needs a read implementation (here it just signals end-of-data)
var readable = new stream.Readable({ read: function () { this.push(null); } });
// a writable stream needs a write implementation
var writable = new stream.Writable({ write: function (chunk, encoding, callback) { callback(); } });
Pipes
Streams can also be piped, which means that you can, for example, take a readable stream & pipe it into a writable stream.
Take the simple scenario of serving a text file to the client:
the text file can be opened as a readable stream and piped to process.stdout, which is a writable stream.
To create a readable stream from a file you can use the fs module's createReadStream method, and piping it into process.stdout can be as simple as the following few lines:
var fs = require('fs');
var readableStream = fs.createReadStream('./cool.txt');
readableStream.pipe(process.stdout);
Pipes can also be chained for further processing of the data chunks, for example if you want to serve a text file and gzip it (using the core module zlib) before sending it:
var fs = require('fs');
var gzip = require('zlib').createGzip(); // gzip transform stream
var readableStream = fs.createReadStream('./cool.txt');
var writableStream = fs.createWriteStream('./cool.txt.gz');
// the chaining
readableStream.pipe(gzip).pipe(writableStream);
Writing to writable streams
You can open a file as a writable stream, write into it (for example inside a loop), and when the loop ends call the .end method.
example:
var fs = require('fs');
var writableStream = fs.createWriteStream('message.txt');
var words = ['hello', 'stream','world'];
for (let i = 0; i < words.length; i++) {
writableStream.write(words[i]+' ');
}
writableStream.end('.');
// file contents is now: hello stream world .