Node.js

Course by Tomas Katz


Node - Performance
Traditionally, servers used a new process to handle each incoming request. The problem with this was that creating a new process is a memory-expensive procedure: the process has to be initialized and destroyed, all of which takes up RAM. This could not be scaled for very long, and an alternative was to use thread pools instead, retrieving a thread from the pool for each request. This way we don't pay the penalty of creating new OS processes, and threads are lighter than processes; still, threads take up RAM, and switching context between threads wastes unnecessary CPU cycles.

Nginx

Nginx brought in the idea of the single-threaded server and proved that a single-threaded server can handle a huge number of requests efficiently. In this respect, Nginx is the predecessor of Node regarding the use of a single-threaded server. A comparison between Apache and Nginx showed an undeniable performance advantage of Nginx over Apache in handling requests, and also in RAM usage.

Node - Why Is Node More Performant
Why Is Node More Performant? There is no overhead of creating processes, assigning threads, or switching context between threads, since execution uses a single thread dedicated to executing JavaScript in its entirety. JavaScript's native support for first-class functions, together with callback functions and closures, enables fast and correct access to the results of long-running operations. Node also lets you use all the CPU cores (if, say, you are running on a quad-core machine), and this is done quite simply with Node.js, as sketched below.
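
A minimal sketch of using every core with the built-in cluster module (the port number here is an arbitrary choice): the master process forks one worker per CPU core, and each worker runs its own single-threaded event loop while sharing the listening socket.

var cluster = require('cluster');
var os = require('os');
var http = require('http');

if (cluster.isMaster) {
  // fork one worker per CPU core
  for (var i = 0; i < os.cpus().length; i++) {
    cluster.fork();
  }
} else {
  // each worker runs its own event loop; the listening socket is shared
  http.createServer(function (req, res) {
    res.end('handled by worker ' + process.pid + '\n');
  }).listen(3000);
}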

Node - Internals
The event loop and thread starvation

Just like in the browser environment, at the heart of handling events on the server is the event loop, which runs in the background checking for events and executing the JavaScript functions assigned to them. This brings up the issue of thread starvation, known in the browser as blocking the UI, which happens when no CPU resources are left for handling additional operations. Thread starvation happens because of heavy, long-lasting operations that block the OS from assigning new threads, or in Node's case block the single thread from reaching other tasks in the queue. You could say that Node is the wrong choice for heavy server operations, but taking the above into consideration, the same issue will be encountered on other servers as well. The point is that heavy-lifting operations should be off-loaded to offline execution or handled by databases in the first place.

The file system

Node.js uses the CommonJS module loading system. In CommonJS every file stands as a module on its own and has a module object defined for it (in the scope of its execution), which can be accessed only by the defining file. The module object has an exports object defined on it, which provides the API object of the file module. When other files use the globally defined require() function, passing it a path to the file, the script is executed and only the module.exports object is returned. This is desirable since module.exports also acts as a sort of namespace, letting you import modules safely without name collisions. NOTE! An import statement should provide a path that is relative, and without the .js extension. This is done for consistency between server and client, where the client does not know what file types the server is using (C#/TypeScript/JS).
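
A minimal sketch of two CommonJS modules (the file names greeter.js and app.js are hypothetical):

// greeter.js: every file is a module with its own module object
module.exports = {
  greet: function (name) {
    return 'Hello, ' + name;
  }
};

// app.js: require by relative path, without the .js extension
var greeter = require('./greeter');
console.log(greeter.greet('Node')); // prints: Hello, Node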

Important Globals
console

timers (setInterval, setTimeout, clearInterval, clearTimeout)

__filename, __dirname (local globals, specific to file scope): these variables exist in every file's scope and provide the full path to the file and its directory.

process

process.argv (for command-line arguments): When a file is executed from the command line and passed parameters (for example an NPM start script that passes parameters to an installed package), these parameters can be accessed inside the file through process.argv, which is an array of strings. The first element of the array is the name of the process running the script, usually "node". The second element is the full file path of the executing script, for example '/path/to/file/on/your/filesystem/argv.js'. The elements that follow are the arguments passed into the execution command. So an example of the array returned by process.argv would be: [ 'node', '/path/to/file/on/your/filesystem/argv.js', 'foo', 'bar', 'bas' ]. To get only the arguments you can slice the array from the 3rd element to the end, e.g. process.argv.slice(2). (See the sketch below.)

process.nextTick (event loop callback): A highly efficient and simple function that takes a callback function, which is added to the front of the event loop queue. This callback will always be the first task executed in the next event loop iteration. What is important to notice about this function is that its callback is the first JavaScript to execute in every event loop iteration: everything before it is C and everything after it is JavaScript.

stdin - NOTE: RESEARCH
stdout - NOTE: RESEARCH

Buffer (global class): A globally defined class used to convert data between different encodings, for example UTF-8 to ASCII, or ASCII to binary. Example:
var buffer = new Buffer(str, 'utf-8');
var roundTrip = buffer.toString('utf-8');

global - just like the window object in a browser; can be accessed from anywhere in the application.
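
A minimal sketch of process.argv and process.nextTick (the file name argv.js is hypothetical; run it as: node argv.js foo bar bas):

// argv.js
console.log(process.argv);          // [ 'node', '/path/to/argv.js', 'foo', 'bar', 'bas' ]
console.log(process.argv.slice(2)); // [ 'foo', 'bar', 'bas' ]

process.nextTick(function () {
  console.log('second: runs before the event loop moves on to other tasks');
});
console.log('first: the current operation finishes before the nextTick callback');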

Important Core Modules
path (see the sketch after the require steps below)
path.normalize
path.join
path.dirname, path.basename, path.extname

fs (file system)

os (operating system)

util (general utilities)
util.inherits
util.log
util.isArray
util.isDate
util.isError

require (function)

events

stream
stream.Readable
stream.Writable
stream.Duplex
stream.Transform
NOTE! All of the above inherit from Stream, which you should not use directly.
NOTE! Stream inherits from EventEmitter.

Requiring a module: the steps Node takes

The following is the logic Node.js follows for a require('something') statement:
• If something is a core module, return it.
• If something is a relative path (starts with './' or '../'), return that file OR folder.
• If not, look for node_modules/filename or node_modules/foldername at each level up until you find a file OR folder that matches something.
When matching a file OR folder, follow these steps:
• If it matched a file name, return it.
• If it matched a folder name and it has a package.json with main, return that file.
• If it matched a folder name and it has an index file, return it.
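
A minimal sketch of the path helpers listed above (output shown for POSIX-style paths; separators are platform-specific):

var path = require('path');

console.log(path.normalize('/foo//bar/../baz')); // '/foo/baz'
console.log(path.join('foo', 'bar', 'baz.txt')); // 'foo/bar/baz.txt'
console.log(path.dirname('/foo/bar/baz.txt'));   // '/foo/bar'
console.log(path.basename('/foo/bar/baz.txt'));  // 'baz.txt'
console.log(path.extname('/foo/bar/baz.txt'));   // '.txt'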

Inheritance, Events & Streams
Inheritance the Node.js way

Typically in JavaScript, for inheritance you would use either the prototype pattern or Object.create(), passing it a prototype to inherit from. In the prototypal pattern you compose a constructor's prototype object from the inherited classes (constructors), creating the __proto__ chain. Object.create does the same but in a cleaner, simpler way. In Node there is a way to avoid messing around with creating Constructor.prototype instances from other constructors to build the __proto__ chain: the inherits function of the core module util that we discussed previously. The inherits function gets two constructors as arguments: the first is the inheriting constructor, the second is the parent (super/base class). By calling the function with the desired parameters, the inheriting constructor's prototype is set up to inherit from the parent's prototype.
Example: inherits(Bird, Animal);
Another thing to note: in order to inherit own properties of the super class, the inheriting class should call the parent constructor inside its activation scope. Example:
function Bird() {
   Animal.call(this);
}

EventEmitter

events.EventEmitter, example:
var events = require('events');
var emitter = new events.EventEmitter();
var firstCallback = function a(dataArray) {/*DO SOMETHING*/};
var secondCallback = function b(dataArray) {/*DO SOMETHING ELSE*/};
emitter.on('someEvent', firstCallback);
emitter.on('someEvent', secondCallback);
emitter.listeners('someEvent'); // returns [function a(){}, function b(){}]
emitter.emit('someEvent', [1, 2, 3]);
emitter.removeListener('someEvent', firstCallback);
emitter.once('onStartup', function () {
   console.log('this executes once');
});

Combining EventEmitter with Node's inherits function

Inheriting from EventEmitter into your constructor can be useful to export an events API for your module. Many modules build their events by inheriting from EventEmitter. Example:
var inherits = require('util').inherits;
var Emitter = require('events').EventEmitter;

function Animal() {
   Emitter.call(this);
}
inherits(Animal, Emitter); // Animal now fully inherits from EventEmitter

Animal.prototype.startWalking = function (distance) {
   this.emit('walking', distance);
};

var animal = new Animal();
animal.on('walking', function (distance) {
   console.log('animal is walking, and will stop after ' + distance + ' meters');
});
animal.startWalking(2000);

process events you can listen to

The Node process object inherits from EventEmitter and can be listened to on the events it emits. Example:
process.on('uncaughtException', function (err) {
   console.log('Caught exception: ', err);
   console.log('Stack:', err.stack);
   process.exit(1);
});
Some events you can listen to on process:
uncaughtException - raised on an internal error
exit - on exiting the process; no async operations can be done at this point since the process has already been torn down.
SIGINT - emitted when the user presses Ctrl+C while a process is executing in the CLI

Streams

Streams are good for transferring large files from one location to another, because unlike a buffer, streams do not wait until all the data is converted to the desired format; instead they transfer the data in chunks. This way the side that requested the data can almost immediately start doing something with it, like starting to render an image before the whole image is received, or starting to play a video before the whole video was received.
NOTE! Since streams are also event driven, they inherit from EventEmitter and have the emit & on methods.

There are 4 main types of streams:
Readable - streams that can only be read
Writable - streams that can only be written to
Duplex - streams that are both readable & writable
Transform - streams that need to perform some procedure on the received data before transporting it, for example encryption or compression (see the sketch at the end of this section).
These classes can be found under the 'stream' core module (see Important Core Modules).
NOTE! Two noteworthy streams in Node are process.stdin & process.stdout.

Example of creating streams from options objects:
var stream = require('stream');
var readable = new stream.Readable({something: 'something'});
var writable = new stream.Writable({something: 'something'});

Pipes

Streams can also be piped, which means that you can, for example, take a read-only stream and pipe it to a write-only stream. Take the simple scenario of serving a text file to the client: the text file can be a read-only stream and be piped to process.stdout, which is a write-only stream. To create streams from files you can use the fs module's createReadStream method, and piping into process.stdout can be as simple as the following few lines:
var fs = require('fs');
var readableStream = fs.createReadStream('./cool.txt');
readableStream.pipe(process.stdout);

Pipes can also be chained for further processing of the data chunks. For example, if you want to serve a text file and gzip it (using the zlib core module) before writing it out:
var fs = require('fs');
var gzip = require('zlib').createGzip(); // gzip transform stream
var readableStream = fs.createReadStream('./cool.txt');
var writableStream = fs.createWriteStream('./cool.txt.gz');
// the chaining
readableStream.pipe(gzip).pipe(writableStream);

Writing to writable streams

You can define a file as a writable stream and, for example, write into the file inside a loop, then call the .end method when the loop ends. Example:
var fs = require('fs');
var writableStream = fs.createWriteStream('message.txt');
var words = ['hello', 'stream', 'world'];
for (var i = 0; i < words.length; i++) {
   writableStream.write(words[i] + ' ');
}
writableStream.end('.'); // file contents is now: hello stream world .
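
A minimal sketch of a custom Transform stream (the UpperCase name is hypothetical), built with the util.inherits approach shown earlier; it upper-cases each chunk as it passes through:

var stream = require('stream');
var util = require('util');

function UpperCase(options) {
  stream.Transform.call(this, options);
}
util.inherits(UpperCase, stream.Transform);

// _transform receives each chunk, processes it, and pushes it onward
UpperCase.prototype._transform = function (chunk, encoding, done) {
  this.push(chunk.toString().toUpperCase());
  done();
};

// pipe a readable (stdin) through the transform into a writable (stdout)
process.stdin.pipe(new UpperCase()).pipe(process.stdout);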

Node HTTP basics
Main core modules for creating web applications:
• net / require('net'): provides the foundation for creating TCP servers and clients
• dgram / require('dgram'): provides functionality for creating UDP / Datagram sockets
• http / require('http'): provides a high-performing foundation for an HTTP stack (see the sketch below)
• https / require('https'): provides an API for creating TLS / SSL clients and servers
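
A minimal sketch of an HTTP server using the core http module (the port number is an arbitrary choice):

var http = require('http');

http.createServer(function (req, res) {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('Hello from Node\n');
}).listen(3000, function () {
  console.log('listening on http://localhost:3000');
});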

Express
Express is built on top of the "connect" middleware framework and has all its basic features, such as the next() method, the createServer() method, and use() for middleware. All connect middleware can be used with Express, but not all Express middleware can be used with connect, since Express modifies the request & response objects.

Popular connect/express middleware

serve-static
npm install serve-static; included with Express via the .static() method. Serves a directory as the static resources directory. Excellent header support: for example, it sets the status automatically for 200, 304 (not modified), 400, 500. It has an optional second object param that includes an index property, which is an array of permitted index files to serve. Example:
app.use(serveStatic(__dirname + '/public', {'index': ['default.html', 'default.htm']}))

serve-index
npm install serve-index. By default serves an HTML page with the listing of a directory. If the .static() middleware is used before it, the user will get the index.html file before getting the directory listing.

body-parser
Parses string request bodies into JavaScript objects and attaches the parsed body to the request.body object. Raises an error on an invalid JSON string sent to the server as a request body, an error which can be handled using error-handling middleware.

cookie-parser
Cookies: a string of data sent to the client, for the client to use, which can sometimes be modified on the client and sent back to the server in the request header, under the Cookie header. Cookies are a way to keep client state on the server, since HTTP is a stateless protocol. Cookies are stored in the browser and are domain specific, meaning only the domain that sent the document can access the cookies it set.
By default Express has a built-in feature to read cookies from the header: request.headers['cookie'].
By default Express has a built-in feature to write cookies into the response header using res.cookie(key, value).
Adding cookie-parser into the Express queue will parse the request headers into a JavaScript object and attach it to request.cookies.

Cookie signing

Since cookies are stored on the client, they can be forged by malicious third-party client scripts and by attackers who work out how the cookie generation logic works on the server side, even though cookies are domain specific by default as part of the browser's same-origin policy. Techniques of cookie forging are not covered here.

Digital signature - digitally signed cookies ensure that cookies are not forged, and cookie-parser allows us to do this.
HMAC - HMAC is a keyed-Hash Message Authentication Code, which is basically a hash, computed from the cookie value and a secret known only to the server, appended to the end of the server-generated cookie. This hash is checked on the server side with each request, and if the hash sent from the client does not match the hash generated by the server, the request is denied and discarded.

Digitally signing with cookie-parser

Creation of a secret key to use with signed cookies: to assign a key to be used with cookies, you need to pass the secret string to the cookie-parser creation function.
e.g.: express().use(cookieParser('my super secret sign key'))
Reading signed cookies: if the request contains signed cookies, you can access them on the request object under the signedCookies object as shown below:
request.signedCookies.name
Creating a signed cookie to send with the response: to create a signed cookie, you can pass an options object to the response.cookie(key, value, {options...}) function as shown below:
res.cookie('name', 'foo', { signed: true });
Example of the combination of all 3:
express()
   .use(cookieParser('my super secret sign key'))
   .use('/toggle', function (req, res) {
      if (req.signedCookies.name) {
         res.clearCookie('name');
         res.end('name cookie cleared! Was: ' + req.signedCookies.name);
      }
      else {
         res.cookie('name', 'foo', { signed: true });
         res.end('name cookie set!');
      }
   })
   .listen(3000);

HttpOnly cookies

By default the browser gets access to cookies from JavaScript via the document.cookie object, which makes cookies vulnerable to Cross-Site Scripting (XSS). For this case there is a way to deny JavaScript access to cookies, using the HttpOnly flag added to the Set-Cookie header. This can be done by providing the options object to the response object's cookie method and setting the httpOnly attribute to true (see the sketch at the end of this section).

A little bit on XSS (Cross-Site Scripting)

XSS is when a malicious user manages to inject JavaScript into your web site's content; that JavaScript can then read cookies that might contain sensitive information for the currently logged-in user and ship it to a malicious web site.

A little about HTTPS:
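
A minimal sketch of setting an HttpOnly (and signed) cookie, assuming the same express + cookie-parser setup as above:

var express = require('express');
var cookieParser = require('cookie-parser');

express()
   .use(cookieParser('my super secret sign key'))
   .use(function (req, res) {
      // httpOnly: true keeps client-side document.cookie from reading this value
      res.cookie('name', 'foo', { signed: true, httpOnly: true });
      res.end('HttpOnly cookie set!');
   })
   .listen(3000);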