Table of Contents for
Node.js 8 the Right Way


Node.js 8 the Right Way by Jim Wilson. Published by Pragmatic Bookshelf, 2018.
  1. Title Page
  2. Acknowledgments
  3. Preface
  4. Why Node.js the Right Way?
  5. What’s in This Book
  6. What This Book Is Not
  7. Code Examples and Conventions
  8. Online Resources
  9. Part I. Getting Up to Speed on Node.js 8
  10. 1. Getting Started
  11. Thinking Beyond the Web
  12. Node.js’s Niche
  13. How Node.js Applications Work
  14. Aspects of Node.js Development
  15. Installing Node.js
  16. 2. Wrangling the File System
  17. Programming for the Node.js Event Loop
  18. Spawning a Child Process
  19. Capturing Data from an EventEmitter
  20. Reading and Writing Files Asynchronously
  21. The Two Phases of a Node.js Program
  22. Wrapping Up
  23. 3. Networking with Sockets
  24. Listening for Socket Connections
  25. Implementing a Messaging Protocol
  26. Creating Socket Client Connections
  27. Testing Network Application Functionality
  28. Extending Core Classes in Custom Modules
  29. Developing Unit Tests with Mocha
  30. Wrapping Up
  31. 4. Connecting Robust Microservices
  32. Installing ØMQ
  33. Publishing and Subscribing to Messages
  34. Responding to Requests
  35. Routing and Dealing Messages
  36. Clustering Node.js Processes
  37. Pushing and Pulling Messages
  38. Wrapping Up
  39. Part II. Working with Data
  40. 5. Transforming Data and Testing Continuously
  41. Procuring External Data
  42. Behavior-Driven Development with Mocha and Chai
  43. Extracting Data from XML with Cheerio
  44. Processing Data Files Sequentially
  45. Debugging Tests with Chrome DevTools
  46. Wrapping Up
  47. 6. Commanding Databases
  48. Introducing Elasticsearch
  49. Creating a Command-Line Program in Node.js with Commander
  50. Using request to Fetch JSON over HTTP
  51. Shaping JSON with jq
  52. Inserting Elasticsearch Documents in Bulk
  53. Implementing an Elasticsearch Query Command
  54. Wrapping Up
  55. Part III. Creating an Application from the Ground Up
  56. 7. Developing RESTful Web Services
  57. Advantages of Express
  58. Serving APIs with Express
  59. Writing Modular Express Services
  60. Keeping Services Running with nodemon
  61. Adding Search APIs
  62. Simplifying Code Flows with Promises
  63. Manipulating Documents RESTfully
  64. Emulating Synchronous Style with async and await
  65. Providing an Async Handler Function to Express
  66. Wrapping Up
  67. 8. Creating a Beautiful User Experience
  68. Getting Started with webpack
  69. Generating Your First webpack Bundle
  70. Sprucing Up Your UI with Bootstrap
  71. Bringing in Bootstrap JavaScript and jQuery
  72. Transpiling with TypeScript
  73. Templating HTML with Handlebars
  74. Implementing hashChange Navigation
  75. Listing Objects in a View
  76. Saving Data with a Form
  77. Wrapping Up
  78. 9. Fortifying Your Application
  79. Setting Up the Initial Project
  80. Managing User Sessions in Express
  81. Adding Authentication UI Elements
  82. Setting Up Passport
  83. Authenticating with Facebook, Twitter, and Google
  84. Composing an Express Router
  85. Bringing in the Book Bundle UI
  86. Serving in Production
  87. Wrapping Up
  88. 10. BONUS: Developing Flows with Node-RED
  89. Setting Up Node-RED
  90. Securing Node-RED
  91. Developing a Node-RED Flow
  92. Creating HTTP APIs with Node-RED
  93. Handling Errors in Node-RED Flows
  94. Wrapping Up
  95. A1. Setting Up Angular
  96. A2. Setting Up React

Capturing Data from an EventEmitter

EventEmitter is a very important class in Node.js. It provides a channel for events to be dispatched and listeners to be notified. Many objects you’ll encounter in Node.js inherit from EventEmitter, like the Streams we saw in the last section.

Now let’s modify our previous program to capture the child process’s output by listening for events on the stream. Open an editor to the watcher-spawn.js file from the previous section, then find the call to fs.watch(). Replace it with this:

 fs.watch(filename, () => {
   const ls = spawn('ls', ['-l', '-h', filename]);
   let output = '';

   ls.stdout.on('data', chunk => output += chunk);

   ls.on('close', () => {
     const parts = output.split(/\s+/);
     console.log([parts[0], parts[4], parts[8]]);
   });
 });

Save this updated file as watcher-spawn-parse.js. Run it as usual, then touch the target file in a separate terminal. You should see output something like this:

 $ node watcher-spawn-parse.js target.txt
 Now watching target.txt for changes...
 [ '-rw-rw-r--', '0', 'target.txt' ]

The new callback starts out the same as before, creating a child process and assigning it to a variable called ls. It also creates an output variable, which will buffer the output coming from the child process.

Notice that the output variable is declared with the keyword let. Like const, let declares a variable, but one that can be assigned a new value more than once. Generally speaking, you should use const to declare your variables unless you know that the value needs to change at runtime.

Next we add event listeners. An event listener is a callback function that is invoked when an event of a specified type is dispatched. Since the Stream class inherits from EventEmitter, we can listen for events from the child process’s standard output stream:

 ls.stdout.on('data', chunk => output += chunk);

A lot is going on in this single line of code, so let’s break it down.

Notice that the arrow function takes a parameter called chunk. When an arrow function takes exactly one parameter, like this one, you can omit the parentheses around the parameter.

The on() method adds a listener for the specified event type. We listen for data events because we’re interested in data coming out of the stream.

Events can send along extra information, which arrives in the form of parameters to the callbacks. data events in particular pass along a Buffer object. Each time we get a chunk of data, we append it to our output.

A Buffer is Node.js’s way of representing binary data.[19] It points to a blob of memory allocated by Node.js’s native core, outside of the JavaScript engine. Buffers can’t be resized and they require encoding and decoding to convert to and from JavaScript strings.

Any time you add a non-string to a string in JavaScript (like we’re doing here with chunk), the runtime will implicitly call the object’s toString() method. For a Buffer, this means copying the content into Node.js’s heap using the default encoding (UTF-8).

Shuttling data in this way can be a slow operation, relatively speaking. If you can, it’s often better to work with Buffers directly, but strings are more convenient. For this tiny amount of data the impact of conversion is small, but it’s something to keep in mind as you work more with Buffers.

Like Stream, the ChildProcess class extends EventEmitter, so we can add listeners to it, as well.

 ls.on('close', () => {
   const parts = output.split(/\s+/);
   console.log([parts[0], parts[4], parts[8]]);
 });

After a child process has exited and all its streams have been flushed, it emits a close event. When the callback printed here is invoked, we parse the output data by splitting on sequences of one or more whitespace characters (using the regular expression /\s+/). Finally, we use console.log to report on the first, fifth, and ninth fields (indexes 0, 4, and 8), which correspond to the permissions, size, and filename, respectively.

We’ve seen a lot of Node.js’s features in this small problem space of file-watching. You now know how to use key Node.js classes, including EventEmitter, Stream, ChildProcess, and Buffer. You also have firsthand experience writing asynchronous callback functions and coding for the event loop.

Let’s expand on these concepts in the next phase of our filesystem journey: reading and writing files.