Table of Contents for
Node.js 8 the Right Way


Node.js 8 the Right Way by Jim Wilson, published by Pragmatic Bookshelf, 2018
  1. Title Page
  2. Node.js 8 the Right Way
  6.  Acknowledgments
  7.  Preface
  8. Why Node.js the Right Way?
  9. What’s in This Book
  10. What This Book Is Not
  11. Code Examples and Conventions
  12. Online Resources
  13. Part I. Getting Up to Speed on Node.js 8
  14. 1. Getting Started
  15. Thinking Beyond the Web
  16. Node.js’s Niche
  17. How Node.js Applications Work
  18. Aspects of Node.js Development
  19. Installing Node.js
  20. 2. Wrangling the File System
  21. Programming for the Node.js Event Loop
  22. Spawning a Child Process
  23. Capturing Data from an EventEmitter
  24. Reading and Writing Files Asynchronously
  25. The Two Phases of a Node.js Program
  26. Wrapping Up
  27. 3. Networking with Sockets
  28. Listening for Socket Connections
  29. Implementing a Messaging Protocol
  30. Creating Socket Client Connections
  31. Testing Network Application Functionality
  32. Extending Core Classes in Custom Modules
  33. Developing Unit Tests with Mocha
  34. Wrapping Up
  35. 4. Connecting Robust Microservices
  36. Installing ØMQ
  37. Publishing and Subscribing to Messages
  38. Responding to Requests
  39. Routing and Dealing Messages
  40. Clustering Node.js Processes
  41. Pushing and Pulling Messages
  42. Wrapping Up
  44. Part II. Working with Data
  45. 5. Transforming Data and Testing Continuously
  46. Procuring External Data
  47. Behavior-Driven Development with Mocha and Chai
  48. Extracting Data from XML with Cheerio
  49. Processing Data Files Sequentially
  50. Debugging Tests with Chrome DevTools
  51. Wrapping Up
  52. 6. Commanding Databases
  53. Introducing Elasticsearch
  54. Creating a Command-Line Program in Node.js with Commander
  55. Using request to Fetch JSON over HTTP
  56. Shaping JSON with jq
  57. Inserting Elasticsearch Documents in Bulk
  58. Implementing an Elasticsearch Query Command
  59. Wrapping Up
  61. Part III. Creating an Application from the Ground Up
  62. 7. Developing RESTful Web Services
  63. Advantages of Express
  64. Serving APIs with Express
  65. Writing Modular Express Services
  66. Keeping Services Running with nodemon
  67. Adding Search APIs
  68. Simplifying Code Flows with Promises
  69. Manipulating Documents RESTfully
  70. Emulating Synchronous Style with async and await
  71. Providing an Async Handler Function to Express
  72. Wrapping Up
  73. 8. Creating a Beautiful User Experience
  74. Getting Started with webpack
  75. Generating Your First webpack Bundle
  76. Sprucing Up Your UI with Bootstrap
  77. Bringing in Bootstrap JavaScript and jQuery
  78. Transpiling with TypeScript
  79. Templating HTML with Handlebars
  80. Implementing hashChange Navigation
  81. Listing Objects in a View
  82. Saving Data with a Form
  83. Wrapping Up
  84. 9. Fortifying Your Application
  85. Setting Up the Initial Project
  86. Managing User Sessions in Express
  87. Adding Authentication UI Elements
  88. Setting Up Passport
  89. Authenticating with Facebook, Twitter, and Google
  90. Composing an Express Router
  91. Bringing in the Book Bundle UI
  92. Serving in Production
  93. Wrapping Up
  95. 10. BONUS: Developing Flows with Node-RED
  96. Setting Up Node-RED
  97. Securing Node-RED
  98. Developing a Node-RED Flow
  99. Creating HTTP APIs with Node-RED
  100. Handling Errors in Node-RED Flows
  101. Wrapping Up
  102. A1. Setting Up Angular
  103. A2. Setting Up React

Reading and Writing Files Asynchronously

Earlier in this chapter, we wrote a series of Node.js programs that could watch files for changes. Now let’s explore Node.js’s methods for reading and writing files. Along the way we’ll see two common error-handling patterns in Node.js: error events on EventEmitters and err callback arguments.

There are a few approaches to reading and writing files in Node. The simplest is to read in or write out the entire file at once. This technique works well for small files. Other approaches read and write by creating Streams or staging content in a Buffer. Here’s an example of the whole-file-at-once approach:

 'use strict';
 const fs = require('fs');
 fs.readFile('target.txt', (err, data) => {
   if (err) {
     throw err;
   }
   console.log(data.toString());
 });

Save this file as read-simple.js and run it with node:

 $ node read-simple.js

You’ll see the contents of target.txt echoed to the command line. If the file is empty, all you’ll see is a blank line.

Notice how the first parameter to the readFile() callback handler is err. If readFile is successful, then err will be null. Otherwise the err parameter will contain an Error object. This is a common error-reporting pattern in Node.js, especially for built-in modules. In our example’s case, we throw the error if there was one. Recall that an uncaught exception in Node.js will halt the program by escaping the event loop.

The second parameter to our callback, data, is a Buffer—the same kind that was passed to our various callbacks in previous sections.

Writing a file using the whole-file approach is similar. Here’s an example:

 'use strict';
 const fs = require('fs');
 fs.writeFile('target.txt', 'hello world', (err) => {
   if (err) {
     throw err;
   }
   console.log('File saved!');
 });

This program writes hello world to target.txt (creating it if it doesn’t exist, or overwriting it if it does). If for any reason the file can’t be written, then the err parameter will contain an Error object.

Creating Read and Write Streams

You create a read stream or a write stream by using fs.createReadStream() and fs.createWriteStream(), respectively. For example, here’s a very short program called cat.js. It uses a file stream to pipe a file’s data to standard output:

 #!/usr/bin/env node
 'use strict';
 require('fs').createReadStream(process.argv[2]).pipe(process.stdout);

Because the first line starts with #!, you can execute this program directly in Unix-like systems. You don’t need to pass it into the node program (although you still can).

Use chmod to make it executable:

 $ chmod +x cat.js

Then, to run it, send the name of the chosen file as an additional argument:

 $ ./cat.js target.txt
 hello world

The code in cat.js does not bother assigning the fs module to a variable. The require() function returns a module object, so we can call methods on it directly.

You can also listen for data events from the file stream instead of calling pipe(). The following program called read-stream.js does this:

 'use strict';
 require('fs').createReadStream(process.argv[2])
   .on('data', chunk => process.stdout.write(chunk))
   .on('error', err => process.stderr.write(`ERROR: ${err.message}\n`));

Here we use process.stdout.write() to echo data, rather than console.log. The incoming data chunks already contain any newline characters from the input file. We don’t need the extra newline that console.log would add.

Conveniently, the return value of on() is the same emitter object. We take advantage of this fact to chain our handlers, setting up one right after the other.

When working with an EventEmitter, the way to handle errors is to listen for error events. Let’s trigger an error to see what happens. Run the program, but specify a file that doesn’t exist:

 $ node read-stream.js no-such-file
 ERROR: ENOENT: no such file or directory, open 'no-such-file'

Since we’re listening for error events, Node.js invokes our handler (and then proceeds to exit normally). If you don’t listen for error events, but one happens anyway, Node.js will throw an exception. And as we saw before, an uncaught exception will cause the process to terminate.

Blocking the Event Loop with Synchronous File Access

The file-access methods we’ve discussed in this chapter so far are asynchronous. They perform their I/O duties—waiting as necessary—completely in the background, only to invoke callbacks later. This is by far the preferred way to do I/O in Node.

Even so, many of the methods in the fs module have synchronous versions, as well. These end in *Sync, like readFileSync, for example. Doing synchronous file access might look familiar to you if you haven’t done a lot of async development in the past. However, it comes at a substantial cost.

When you use the *Sync methods, the Node.js process will block until the I/O finishes. This means Node.js won’t execute any other code, won’t trigger any callbacks, won’t process any events, won’t accept any connections—nothing. It’ll just sit there, waiting for the operation to complete.

However, synchronous methods are simpler to use since they lack the callback step. They either return successfully or throw an exception, without the need for a callback function. There actually are cases where this style of access is OK; we’ll discuss them in the next section.

Here’s an example of how to read a file using the readFileSync() method:

 const fs = require('fs');
 const data = fs.readFileSync('target.txt');
 process.stdout.write(data.toString());

The return value of readFileSync() is a Buffer—the same as the parameter passed to callbacks of the asynchronous readFile() method we saw before.

Performing Other File-System Operations

Node.js’s fs module has many other methods that map nicely onto POSIX conventions. (POSIX is a family of standards for interoperability between operating systems, including filesystem utilities.)[20] To name a few examples, you can copy files with copyFile() and unlink() (delete) them. You can use chmod() to change permissions and mkdir() to create directories.

These functions rely on the same kinds of callback parameters we’ve used in this chapter. They’re all asynchronous by default, but many come with equivalent *Sync versions.