Table of Contents for
Node.js 8 the Right Way


Node.js 8 the Right Way by Jim Wilson. Published by Pragmatic Bookshelf, 2018
  1. Title Page
  2. Node.js 8 the Right Way
  6.  Acknowledgments
  7.  Preface
  8. Why Node.js the Right Way?
  9. What’s in This Book
  10. What This Book Is Not
  11. Code Examples and Conventions
  12. Online Resources
  13. Part I. Getting Up to Speed on Node.js 8
  14. 1. Getting Started
  15. Thinking Beyond the Web
  16. Node.js’s Niche
  17. How Node.js Applications Work
  18. Aspects of Node.js Development
  19. Installing Node.js
  20. 2. Wrangling the File System
  21. Programming for the Node.js Event Loop
  22. Spawning a Child Process
  23. Capturing Data from an EventEmitter
  24. Reading and Writing Files Asynchronously
  25. The Two Phases of a Node.js Program
  26. Wrapping Up
  27. 3. Networking with Sockets
  28. Listening for Socket Connections
  29. Implementing a Messaging Protocol
  30. Creating Socket Client Connections
  31. Testing Network Application Functionality
  32. Extending Core Classes in Custom Modules
  33. Developing Unit Tests with Mocha
  34. Wrapping Up
  35. 4. Connecting Robust Microservices
  36. Installing ØMQ
  37. Publishing and Subscribing to Messages
  38. Responding to Requests
  39. Routing and Dealing Messages
  40. Clustering Node.js Processes
  41. Pushing and Pulling Messages
  42. Wrapping Up
  44. Part II. Working with Data
  45. 5. Transforming Data and Testing Continuously
  46. Procuring External Data
  47. Behavior-Driven Development with Mocha and Chai
  48. Extracting Data from XML with Cheerio
  49. Processing Data Files Sequentially
  50. Debugging Tests with Chrome DevTools
  51. Wrapping Up
  52. 6. Commanding Databases
  53. Introducing Elasticsearch
  54. Creating a Command-Line Program in Node.js with Commander
  55. Using request to Fetch JSON over HTTP
  56. Shaping JSON with jq
  57. Inserting Elasticsearch Documents in Bulk
  58. Implementing an Elasticsearch Query Command
  59. Wrapping Up
  61. Part III. Creating an Application from the Ground Up
  62. 7. Developing RESTful Web Services
  63. Advantages of Express
  64. Serving APIs with Express
  65. Writing Modular Express Services
  66. Keeping Services Running with nodemon
  67. Adding Search APIs
  68. Simplifying Code Flows with Promises
  69. Manipulating Documents RESTfully
  70. Emulating Synchronous Style with async and await
  71. Providing an Async Handler Function to Express
  72. Wrapping Up
  73. 8. Creating a Beautiful User Experience
  74. Getting Started with webpack
  75. Generating Your First webpack Bundle
  76. Sprucing Up Your UI with Bootstrap
  77. Bringing in Bootstrap JavaScript and jQuery
  78. Transpiling with TypeScript
  79. Templating HTML with Handlebars
  80. Implementing hashChange Navigation
  81. Listing Objects in a View
  82. Saving Data with a Form
  83. Wrapping Up
  84. 9. Fortifying Your Application
  85. Setting Up the Initial Project
  86. Managing User Sessions in Express
  87. Adding Authentication UI Elements
  88. Setting Up Passport
  89. Authenticating with Facebook, Twitter, and Google
  90. Composing an Express Router
  91. Bringing in the Book Bundle UI
  92. Serving in Production
  93. Wrapping Up
  95. 10. BONUS: Developing Flows with Node-RED
  96. Setting Up Node-RED
  97. Securing Node-RED
  98. Developing a Node-RED Flow
  99. Creating HTTP APIs with Node-RED
  100. Handling Errors in Node-RED Flows
  101. Wrapping Up
  102. A1. Setting Up Angular
  103. A2. Setting Up React

Extending Core Classes in Custom Modules

The Node.js program we made in the last section exposed a flaw in our client code; namely, that it doesn’t buffer its inputs. Any message that arrives as multiple data events will crash it.

So really the client program has two jobs to do. One is to buffer incoming data into messages. The other is to handle each message when it arrives.

Rather than cramming both of these jobs into one Node.js program, the right thing to do is to turn at least one of them into a Node.js module. We’ll create a module that handles the input-buffering piece so that the main program can reliably get full messages. Along the way, we’ll need to talk about custom modules and extending core classes in Node.

Extending EventEmitter

To relieve the client program from the danger of split JSON messages, we’ll implement an LDJ buffering client module. Then we’ll incorporate it into the network-watcher client.

Inheritance in Node

First let’s have a look at how Node.js does inheritance. The following code sets up LDJClient to inherit from EventEmitter.

 const EventEmitter = require('events').EventEmitter;
 class LDJClient extends EventEmitter {
   constructor(stream) {
     super();
   }
 }

LDJClient is a class, which means other code should call new LDJClient(stream) to get an instance. The stream parameter is an object that emits data events, such as a Socket connection.

Inside the constructor function, we first call super to invoke EventEmitter’s own constructor function. Whenever you implement a class that extends another class, you should begin by calling super with the appropriate constructor arguments.

You might be interested to know that under the hood, JavaScript uses prototypal inheritance to establish the relationship between LDJClient and EventEmitter. Prototypal inheritance is powerful and can do more than the class syntax exposes, but working with it directly is increasingly rare. Code to use LDJClient might look like this:

 const client = new LDJClient(networkStream);
 client.on('message', message => {
   // Take action for this message.
 });

The class hierarchy is now in place, but we haven’t implemented anything to emit message events. Let’s look at this next, and talk about buffering data events in Node.

Buffering Data Events

It’s time to use the stream parameter in the LDJClient to retrieve and buffer input. The goal is to take the incoming raw data from the stream and convert it into message events containing the parsed message objects.

Take a look at the following updated constructor. It appends incoming data chunks to a running buffer string and scans for line endings (which should be JSON message boundaries).

 constructor(stream) {
   super();
   let buffer = '';
   stream.on('data', data => {
     buffer += data;
     let boundary = buffer.indexOf('\n');
     while (boundary !== -1) {
       const input = buffer.substring(0, boundary);
       buffer = buffer.substring(boundary + 1);
       this.emit('message', JSON.parse(input));
       boundary = buffer.indexOf('\n');
     }
   });
 }

We start out by calling super, just like before, and then set up a string variable called buffer to capture incoming data. Next, we use stream.on to handle data events.

The code inside the data event handler is dense, but it’s not fancy. We append raw data to the end of the buffer and then look for completed messages from the front. Each message string is sent through JSON.parse and finally emitted by the LDJClient as a message event via this.emit.

At this point, the problem we started with (handling split messages) is effectively solved. Whether ten messages come in on a single data event or only half of one does, they’ll all precipitate message events on the LDJClient instance.

Next we need to put this class into a Node.js module so our upstream client can use it.

Exporting Functionality in a Module

Let’s pull together the previous code samples and expose LDJClient as a module. Start by creating a directory called lib. You could name it something else, but there is a strong convention in the Node.js community to put supporting code in the lib directory.

Next, open your text editor and insert the following:

 'use strict';
 const EventEmitter = require('events').EventEmitter;
 class LDJClient extends EventEmitter {
   constructor(stream) {
     super();
     let buffer = '';
     stream.on('data', data => {
       buffer += data;
       let boundary = buffer.indexOf('\n');
       while (boundary !== -1) {
         const input = buffer.substring(0, boundary);
         buffer = buffer.substring(boundary + 1);
         this.emit('message', JSON.parse(input));
         boundary = buffer.indexOf('\n');
       }
     });
   }

   static connect(stream) {
     return new LDJClient(stream);
   }
 }

 module.exports = LDJClient;

Save the file as lib/ldj-client.js. The code for this module is the combination of previous examples plus a static method—the new module.exports section at the end.

Inside the class definition, after the constructor, we’re adding a static method called connect. A static method is attached to the LDJClient class itself rather than applied to individual instances. The connect method is merely a convenience for consumers of the library so that they don’t have to use the new operator to create an instance of LDJClient.

In a Node.js module, the module.exports object is the bridge between the module code and the outside world. Any properties you set on exports will be available to upstream code that pulls in the module. In our case, we’re exporting the LDJClient class itself.

Code to use the LDJ module will look something like this:

 const LDJClient = require('./lib/ldj-client.js');
 const client = new LDJClient(networkStream);

Or, using the connect method, it could look like this:

 const client = require('./lib/ldj-client.js').connect(networkStream);

Notice that in both cases, the require function takes an actual path here, rather than the shorthand module names we’ve seen previously, like fs, net, and util. When a path is provided to require, Node.js will attempt to resolve the path relative to the current file.

Our module is done! Now let’s augment the network-watching client to use the module, to bring it all together.

Importing a Custom Node.js Module

It’s time to make use of our custom module. Let’s modify the client to use it rather than reading directly from the TCP stream.

Open a text editor and enter the following:

 'use strict';
 const netClient = require('net').connect({port: 60300});
 const ldjClient = require('./lib/ldj-client.js').connect(netClient);

 ldjClient.on('message', message => {
   if (message.type === 'watching') {
     console.log(`Now watching: ${message.file}`);
   } else if (message.type === 'changed') {
     console.log(`File changed: ${new Date(message.timestamp)}`);
   } else {
     throw Error(`Unrecognized message type: ${message.type}`);
   }
 });

Save this file as net-watcher-ldj-client.js. It’s similar to our net-watcher-json-client from Creating Socket Client Connections. The major difference is that instead of sending data buffers directly to JSON.parse, this program relies on the ldj-client module to produce message events.

To make sure it solves the split-message problem, let’s run the test service:

 $ node test-json-service.js
 Test server listening for subscribers...

Then, in a different terminal, use the new client to connect to it:

 $ node net-watcher-ldj-client.js
 File changed: Tue Jan 26 2016 05:54:59 GMT-0500 (EST)

Success! You now have a server and client that use a custom message format to reliably communicate. To round out this chapter, we’ll bring in a popular testing framework called Mocha and use it to orchestrate our unit tests.