Node.js Tips — Promises, CSV to JSON, Watching Files

Like any kind of app, Node apps come with difficult issues to solve.

In this article, we’ll look at some solutions to common problems when writing Node apps.

Handle Errors Thrown by require() in Node.js

We can use try-catch to catch errors thrown by require.

For instance, we can write:

try {
  require('./app.js');
}
catch (e) {
  console.log(e);
}

We just call require inside the try block, then catch any error it throws in the catch block.

How to use async/await with Streams

We can use async and await with streams by wrapping the stream in a promise with the Promise constructor.

For instance, we can write:

const fs = require('fs');
const crypto = require('crypto');

const fd = fs.createReadStream('/foo.txt');
const hash = crypto.createHash('sha1');
hash.setEncoding('hex');
fd.pipe(hash);

const readHash = new Promise((resolve, reject) => {
  // 'finish' fires once all the piped data has been written and flushed
  hash.on('finish', () => resolve(hash.read()));
  fd.on('error', reject);
});

Then we can use readHash in an async function by writing:

(async function() {
  const sha1 = await readHash;
  console.log(sha1);
}());

Promise Retry Design Patterns

We can retry a promise-returning function by calling it again when it rejects.

For instance, we can write:

const retry = (fn, retries=3, err=null) => {
  if (!retries) {
    return Promise.reject(err);
  }
  return fn().catch(err => {
    return retry(fn, (retries - 1), err);
  });
}

fn is a function that returns a promise.

We retry as long as we haven’t exhausted the retries.

If the promise returned by fn rejects, the catch callback calls retry again with retries reduced by 1.

We do that until retries reaches 0, then the retries are exhausted.
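
For example, we could use it like this; fetchData and makeRequest are just placeholder names for any promise-returning function:

// fetchData is a hypothetical function that returns a promise
const fetchData = () => makeRequest('https://example.com/data');

retry(fetchData, 3)
  .then((data) => console.log(data))
  .catch((err) => console.error('all retries failed', err));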

We Don’t Need .catch(err => console.error(err)) When Writing Promise Code

Node.js logs a warning when a promise rejection goes unhandled (and newer versions terminate the process on unhandled rejections), so an error won't disappear silently even without a trailing .catch(err => console.error(err)).

Still, it's usually better to handle rejections deliberately rather than rely on that default behavior.
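
If we do want one central place to log them ourselves, we can register a process-level listener:

// log any promise rejection that has no .catch handler attached
process.on('unhandledRejection', (reason) => {
  console.error('Unhandled rejection:', reason);
});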

How to convert CSV to JSON in Node.js

We can use the csvtojson library to convert CSV to JSON.

To install it, we run:

npm install --save csvtojson

Then we can use it by writing:

const csv = require("csvtojson");

const readCsv = (csvFilePath) => {
  csv()
    .fromFile(csvFilePath)
    .then((jsonArrayObj) => {
       console.log(jsonArrayObj);
     })
}

We can read small files by passing in the csvFilePath to the fromFile method.

It returns a promise.

We get the converted result in the jsonArrayObj parameter.

We can also read the content from a stream and then get the JSON.

For instance, we can write:

const readFromStream = (readableStream) => {
  csv()
    .fromStream(readableStream)
    .subscribe((jsonObj) => {
       console.log(jsonObj)
    })
}

We call the fromStream method to get data from a stream.

Then we call subscribe with a callback to read the object.

Also, we can use fromFile with async and await since it returns a promise:

const csvToJson = async (filePath) => {
  const jsonArray = await csv().fromFile(filePath);
  return jsonArray;
}

It can also be used on the command line.

For example, we can run it directly by running:

npm install -g csvtojson
csvtojson ./csvFile.csv

Edit Node App Files on the Server Without Restarting Node.js and See the Latest Changes

We can use a package like Node-Supervisor to watch for file changes and restart the Node app automatically.

To install it, we run:

npm install supervisor -g

Then we can run our program with Node-Supervisor by running:

supervisor app.js

Likewise, we can use Nodemon to do the same thing.

We can install it by running:

npm install -g nodemon

to install it globally.

Or we can install it by running:

npm install --save-dev nodemon

to install it as a project’s dev dependency.

Then we can run:

nodemon app.js

to run our app.
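
Alternatively, if nodemon is installed as a dev dependency, we can add a script to package.json (the dev script name here is just an example) and run it with npm run dev:

{
  "scripts": {
    "dev": "nodemon app.js"
  }
}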

Chain and Share Prior Results with Promises

To share prior results between promises, we can chain them with then.

For instance, we can write:

makeRequest(url1, data1)
  .then((result1) => {
    return makeRequest(url2, data2);
  })
  .then((result2) => {
    return makeRequest(url3, data3);
  })
  .then((result3) => {
     // ...
  });

We get the resolved results from the previous promise in the then callbacks.

To make this shorter, we can use async and await to do the same thing:

const makeRequests = async () => {
  //...
  const r1 = await makeRequest(url1, data1);
  const r2 = await makeRequest(url2, data2);
  const r3 = await makeRequest(url3, data3);
  return { r1, r2, r3 };
}

Conclusion

We can put streams in promises.

Also, we can watch for code file changes and restart our Node app with various programs.

The csvtojson library lets us convert CSV to JSON.

A rejected promise can be retried by invoking the promise-returning function again.

Node.js Tips — Modules, Hashes, Favicons, Nested Routers

Like any kind of app, Node apps come with difficult issues to solve.

In this article, we’ll look at some solutions to common problems when writing Node apps.

Generate Random SHA1 Hash to Use as ID in Node.js

To generate a random SHA1 hash, we can use the crypto module.

For instance, we can write:

const crypto = require('crypto');
const currentDate = (new Date()).getTime().toString();
const random = Math.random().toString();
crypto.createHash('sha1').update(currentDate + random).digest('hex');

We use the getTime method of the Date instance to get the timestamp.

Then we create a random number.

Next, we create the hash with the crypto module's createHash and update methods.

Then we get the hex string with digest.

Set Custom Favicon in Express

We can serve a custom favicon by using the serve-favicon package.

For example, we can write:

const path = require('path');
const favicon = require('serve-favicon');

app.use(favicon(path.join(__dirname, 'public', 'images', 'favicon.ico')));

The package is installed by running:

npm i serve-favicon

Then we can use the middleware by passing the favicon's path to it.

Open Default Browser and Navigate to a Specific URL in a Node App

We can use the opn package to open the browser and go to a specific URL.

To install it, we run:

npm install opn

Then we can write:

const opn = require('opn');
opn('http://example.com');

to open the URL with the default browser.

We can also specify the browser by writing:

opn('http://example.com', { app: 'firefox' });

Then the URL will open in Firefox.

Express.js with Nested Router

We can nest routers in Express.

For instance, we can write:

const express = require('express');
const app = express();

const userRouter = express.Router();
const itemRouter = express.Router({ mergeParams: true });

userRouter.use('/:userId/items', itemRouter);

userRouter.route('/')
  .get((req, res) => {
    res.send('hello users');
  });

userRouter.route('/:userId')
  .get((req, res) => {
    res.status(200).end();
  });

itemRouter.route('/')
  .get((req, res) => {
    res.send('hello');
  });

itemRouter.route('/:itemId')
  .get((req, res) => {
    res.send(`${req.params.itemId} ${req.params.userId}`);
  });

app.use('/user', userRouter);

app.listen(3000);

We just mount the nested router at the path we want.

mergeParams is needed on the itemRouter since we want to access parameters from the parent router.

Other than that, we just nest itemRouter in userRouter by writing:

userRouter.use('/:userId/items', itemRouter);

Convert a Binary NodeJS Buffer to JavaScript ArrayBuffer

We can convert a Node buffer to a JavaScript ArrayBuffer by writing:

const toArrayBuffer = (buffer) => {
  const arrayBuffer = new ArrayBuffer(buffer.length);
  const view = new Uint8Array(arrayBuffer);
  for (let i = 0; i < buffer.length; i++) {
    view[i] = buffer[i];
  }
  return arrayBuffer;
}

We just copy each byte of the buffer into the Uint8Array.

We can convert the other way by writing:

const toBuffer = (arrayBuffer) => {
  const buffer = Buffer.alloc(arrayBuffer.byteLength);
  const view = new Uint8Array(arrayBuffer);
  for (let i = 0; i < buffer.length; i++) {
    buffer[i] = view[i];
  }
  return buffer;
}

We create the Buffer with the alloc method.

Then we pass the arrayBuffer to the Uint8Array constructor so we can loop through it.

Then we assign all the bytes to the buffer.
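
As a quick sanity check, a round trip with the two helpers above might look like this (the sample string is arbitrary):

const original = Buffer.from('hello');
const arrayBuffer = toArrayBuffer(original);
const copy = toBuffer(arrayBuffer);

console.log(copy.toString()); // 'hello'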

Create Directory When Writing To File In Node Apps

We can use the recursive option for the fs.mkdir method to create a directory before creating the file.

For instance, we can write:

fs.mkdir('/foo/bar/file', { recursive: true }, (err) => {
  if (err) {
    console.error(err);
  }
});

We create the whole directory path /foo/bar/file, including any missing parent directories, since we have the recursive option set to true.

Also, we can use the promise version by writing:

fs.promises.mkdir('/foo/bar/file', { recursive: true }).catch(console.error);

Load an External JS File with Access to Local Variables

We can load external JS files from another file if we create a module.

For instance, we can write:

module.js

module.exports = {
  foo(bar){
    //...
  }
}

app.js

const myModule = require('./module');
myModule.foo('bar');

We export the foo function by putting it in module.exports and import it with require.

Then we call it with myModule.foo('bar').

Conclusion

We can use the crypto module to generate a hash with a timestamp and a random string.

Also, we can convert between a Node buffer and an ArrayBuffer with loops.

The serve-favicon package lets us serve favicons in our Express apps.

We can create modules to export functions that can be called from other files.

Express routers can be nested more than one level deep.

Node.js Tips — Format JSON, Remove Object with MongoDB, Parallel Execution

Like any kind of app, Node apps come with difficult issues to solve.

In this article, we’ll look at some solutions to common problems when writing Node apps.

Write Formatted JSON in Node.js

To print formatted JSON in Node apps, we can use the JSON.stringify method with some extra arguments.

For instance, we can write:

JSON.stringify(obj, null, 2)

to add 2 spaces for indentation.
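
For example, to write an object to a file as pretty-printed JSON (the file path is just a placeholder):

const fs = require('fs');

const obj = { name: 'my-app', version: 1 };

// write the object as formatted JSON with 2-space indentation
fs.writeFileSync('./output.json', JSON.stringify(obj, null, 2));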

Remove Object from Array with MongoDB

We can remove an object from an array with MongoDB by using the update method.

For instance, we can write:

db.collection.update(
  {'_id': ObjectId("5150a1199fac0e6910000002")},
  { $pull: { "items" : { id: 123 } } },
  false,
  true
);

We pass in the query, which is the ID of the document that has the array.

The 2nd argument is the update, which specifies what we want to remove from the items array.

$pull removes the item in the array.

The 3rd argument means we don’t want to do upsert.

The last argument means we process multiple documents.

console.log vs console.info in Node.js

We use the console.log method to log general messages.

console.info is specifically used for logging informational messages.

However, in Node console.info is an alias of console.log; both write to stdout, so the only real difference is the name and the intended use.

Copy to Clipboard in Node.js

To work with the clipboard, we can shell out to the system's clipboard utility.

For instance, in Linux, we can write:

const exec = require('child_process').exec;

const getClipboard = (func) => {
  exec('/usr/bin/xclip -o -selection clipboard', (err, stdout, stderr) => {
    if (err || stderr) {
      throw new Error(stderr);
    }
    func(null, stdout);
  });
};

getClipboard((err, text) => {
  if (err) {
    console.error(err);
  }
  console.log(text);
});

We get the clipboard’s content and output it to standard out with the command.

Also, we can use the clipboardy package to read and write from the clipboard.

For instance, we can write:

const clipboardy = require('clipboardy');
clipboardy.writeSync('copy me');

to copy to the clipboard.

To paste from the clipboard, we can write:

const clipboardy = require('clipboardy');
clipboardy.readSync();

Coordinating Parallel Execution in Node.js

To coordinate parallel execution of code in a Node app, we can use the async library.

For instance, we can write:

const async = require('async');
const fs = require('fs');
const A = (c) => { fs.readFile('file1', c) };
const B = (c) => { fs.readFile('file2', c) };
const C = (err, results) => {
  // results is an array with the contents of both files
}

async.parallel([A, B], C);

We have two file-reading functions, A and B, which we group together so they run in parallel.

Once they're both done, async calls function C with the results.

This gives us full control over which functions run together in parallel (and, with async.series, which run one after the other).

Require and Functions

If we want to import a function with require, we have to assign the function to module.exports.

For instance, if we have:

app/routes.js

module.exports = (app, passport) => {
  // ...
}

Then we can write:

require('./app/routes')(app, passport);

to import the function and call it immediately.

async/await and ES6 yield with Generators

async and await are closely related to generators.

An async function behaves like a generator that always yields promises.

Tools like Babel compile async functions down to generators when targeting older runtimes.

Under the hood, await plays the role of yield: the yielded value is treated as a promise, and its resolved value is passed back into the function.

We should use async and await for chaining promises since we can write the code as if it were synchronous.

However, an async function always returns a promise.

async and await are an abstraction built on top of generators that makes working with them easier.
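
To illustrate the relationship, here is a minimal sketch (not production code) of a helper that drives a generator yielding promises, which is roughly what async/await does for us automatically; run and the sample generator are hypothetical:

// drive a generator that yields promises
const run = (genFn) => {
  const gen = genFn();
  const step = (prev) => {
    const { value, done } = gen.next(prev);
    if (done) {
      return Promise.resolve(value);
    }
    // treat every yielded value as a promise, like await does
    return Promise.resolve(value).then(step);
  };
  return step();
};

run(function* () {
  const greeting = yield Promise.resolve('hello'); // behaves like await
  console.log(greeting);
});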

Get Data Out of a Node.js HTTP Get Request

We can make a GET request with the http module by using its get method.

For instance, we can write:

const http = require('http');

const options = {
  //...
};

http.get(options, (response) => {
  response.setEncoding('utf8')
  response.on('data', console.log)
  response.on('error', console.error)
})

response is a readable stream, so we have to listen to the data event to get the data, which arrives in chunks.

And we listen to the error event to get the error.
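
Since the data arrives in chunks, a common pattern is to collect them and resolve a promise once the end event fires; the URL below is just a placeholder:

const http = require('http');

const get = (url) =>
  new Promise((resolve, reject) => {
    http.get(url, (response) => {
      let body = '';
      response.setEncoding('utf8');
      // accumulate chunks until the response ends
      response.on('data', (chunk) => { body += chunk; });
      response.on('end', () => resolve(body));
      response.on('error', reject);
    }).on('error', reject);
  });

get('http://example.com').then(console.log).catch(console.error);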

Conclusion

We can use the http module to make GET requests.

To remove an item from an array in a MongoDB document, we can use the update method with the $pull command.

The async module lets us deal with parallel and serial execution of functions.

Also, we can manipulate the clipboard by using shell commands or third-party libraries.

async and await are an abstraction on top of generators.

Node.js Tips — Timestamp, Cookies, Send File Responses and Fetch

Like any kind of app, Node apps come with difficult issues to solve.

In this article, we’ll look at some solutions to common problems when writing Node apps.

Fix ‘ReferenceError: fetch is not defined’ Error

The Fetch API isn’t implemented in Node.

If we want to use it, we can use the node-fetch library.

To install it, we run:

npm i node-fetch --save

Then we can use it by writing:

const fetch = require("node-fetch");

There’s also the cross-fetch library.

We can install it by running:

npm install --save cross-fetch

Then we can write:

import fetch from 'cross-fetch';

fetch('https://example.com')
  .then(res => {
    if (res.status >= 400) {
      throw new Error("error");
    }
  })

Avoid Long Nesting of Asynchronous Functions in Node.js

We can reorganize our nested async callbacks by rewriting them as separate named functions.

For example, we can write:

http.createServer((req, res) => {
  getSomeData(client, (someData) => {
    getMoreData(client, (moreData) => {
      //
    });
  });
});

to:

const moreDataParser = (moreData) => {
   // date parsing logic
};

const dataParser = (data) => {
  getMoreData(client, moreDataParser);
};

const callback = (req, res) => {
  getSomeData(client, dataParser);
};

http.createServer(callback);

We move the anonymous functions into their own named functions so we can call them one by one.

Use of the “./bin/www” in Express 4.x

./bin/www is the entry-point file of an Express app scaffolded with the Express generator.

In the start script, we have:

"scripts": {
  "start": "node ./bin/www",
}

to start the app.
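
A simplified sketch of what a generated ./bin/www typically contains (the exact details vary by generator version):

#!/usr/bin/env node
const http = require('http');
const app = require('../app');

// read the port from the environment, falling back to 3000
const port = process.env.PORT || '3000';
app.set('port', port);

const server = http.createServer(app);
server.listen(port);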

Get and Set a Single Cookie with Node.js HTTP Server

If we’re using the http module to create a server, we can parse cookies by splitting the cookie header string.

For instance, we can write:

const http = require('http');

const parseCookies = (cookieStr) => {
  const list = {};
  const cookies = cookieStr ? cookieStr.split(';') : [];
  for (const cookie of cookies) {
    const [key, value] = cookie.split('=');
    list[key.trim()] = decodeURI(value);
  }
  return list;
}

http.createServer((request, response) => {
  const cookies = parseCookies(request.headers.cookie);
  //...
}).listen(8888);

We get the cookie string with request.headers.cookie .

Then we pass that into the parseCookie function.

It splits the cookie string by the semicolon.

Then we loop through them to split the parts by the = sign.

The left side is the key and the right is the value.

To set a cookie, we can write:

const http = require('http');

http.createServer((request, response) => {
  response.writeHead(200, {
    'Set-Cookie': 'foo=bar',
    'Content-Type': 'text/plain'
  });
  response.end('hello\n');
}).listen(8888);

We use writeHead to set headers.

'Set-Cookie' sets the response cookie.

'Content-Type' sets the content type of the response.

Then we use response.end to return the response body.

Get Request Path with the Express req Object

We can use the req.originalUrl property to get the full request path, including the query string, from the Express req object; req.path gives just the path portion.
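
For instance, in a route handler (the /users/:id route is just an example):

app.get('/users/:id', (req, res) => {
  // for GET /users/42?sort=asc
  console.log(req.originalUrl); // '/users/42?sort=asc'
  console.log(req.path);        // '/users/42'
  res.end();
});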

Copy Folder Recursively

We can copy a folder recursively with the fs-extra package.

Its copy method lets us do the copying recursively.

For instance, we can write:

const fs = require('fs-extra');

fs.copy('/tmp/oldDir', '/tmp/newDir', (err) => {
  if (err) {
    console.error(err);
  } else {
    console.log("success");
  }
});

The first argument is the source directory path.

The 2nd is the destination directory path.

The last argument is the callback that’s run when the copy operation ends.

err has the error.
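
In recent versions of fs-extra, copy also returns a promise when no callback is passed, so we can write:

const fs = require('fs-extra');

fs.copy('/tmp/oldDir', '/tmp/newDir')
  .then(() => console.log('success'))
  .catch((err) => console.error(err));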

Return the Current Timestamp with Moment.js

We can return the current timestamp in Moment.js with a few methods.

We can write:

const timestamp = moment();

or:

const timestamp = moment().format();

or:

const timestamp = moment().unix();

or:

const timestamp = moment().valueOf();

to get the current timestamp in various forms (a moment object, an ISO string, Unix seconds, and milliseconds, respectively).

We can also convert it to a Date instance and call getTime :

const timestamp = moment().toDate().getTime();

Fix the ‘TypeError: path must be absolute or specify root to res.sendFile’ Error

This error can be fixed if we pass in an absolute path to res.sendFile .

To do that, we write:

res.sendFile('index.html', { root: __dirname });

or:

const path = require('path');
res.sendFile(path.join(__dirname, '/index.html'));

Either way, we get the absolute path which is required by res.sendFile .

Conclusion

We have to parse cookies manually if we use the http module to create our server.

Fetch API isn’t included with Node’s standard library.

res.sendFile only takes an absolute path.

We can rewrite nested callbacks by reorganizing them into their own functions.

Moment returns the current timestamp with a few functions.

Node.js Tips — Streams, Scraping, and Promisifying Functions

Like any kind of app, Node apps come with difficult issues to solve.

In this article, we’ll look at some solutions to common problems when writing Node apps.

Converting a Buffer into a ReadableStream in Node.js

We can convert a buffer to a readable stream with the ReadableStreamBuffer constructor.

For instance, we can write:

const { ReadableStreamBuffer } = require('stream-buffers');
const readableStreamBuffer = new ReadableStreamBuffer({
  frequency: 10,
  chunkSize: 2048
});

readableStreamBuffer.put(aBuffer);

where aBuffer is the buffer we want to convert.

frequency is how often, in milliseconds, chunks are pumped out.

chunkSize sets the size of each chunk.
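
Once the data has been put in, we can treat it like any other readable stream, for instance piping it into a file (output.txt is just an example path); if I recall the stream-buffers API correctly, stop signals that no more data will be added so the stream can end:

const fs = require('fs');

// pipe the buffered data into a file and signal that we're done adding data
readableStreamBuffer.pipe(fs.createWriteStream('./output.txt'));
readableStreamBuffer.stop();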

Scrape Web Pages in Real-Time with Node.js

We can use the cheerio library to scrape web pages in real-time.

To install it, we run:

npm install cheerio

Then we can use it by writing:

const cheerio = require('cheerio');
const $ = cheerio.load('<h1 class="title">Hello world</h1>');

$('h1.title').text('Hello James');
$('h1').addClass('welcome');

$.html();

We can get HTML’s content and manipulate it as we do with jQuery.

We can combine this with an HTTP client like Axios to get the HTML.

Then we can use Cheerio to parse its content.

For instance, we can write:

const axios = require('axios');
const cheerio = require('cheerio');

axios.get('https://example.com')
.then(({ data }) => {
  const $ = cheerio.load(data);
  const text = $('h1').text();
  console.log(text);
})

We make a GET request with Axios to a website.

Then we use cheerio to parse the data with cheerio.load .

Then we get the content of h1 with the text method.

Any selector can be used to get data.

Generate an MD5 file Hash in JavaScript

We can generate an MD5 hash with the crypto-js package.

To install it, we can run:

npm install crypto-js

Then we can write:

import MD5 from "crypto-js/md5";
const md5Hash = MD5("hello world").toString();

to generate the hash as a hex string (calling toString on the result converts crypto-js's WordArray to hex).

Server-Side Browser Detection with Node.js

We can get the user-agent header to get the user agent string from the request.

To parse the string, we can use the ua-parser-js package.

To install it, we run:

npm install ua-parser-js

In our Express app, we can create our own middleware to check the user agent:

const UAParser = require('ua-parser-js');

const checkBrowser = (req, res, next) => {
  const parser = new UAParser();
  const ua = req.headers['user-agent'];
  const browserName = parser.setUA(ua).getBrowser().name;
  const fullBrowserVersion = parser.setUA(ua).getBrowser().version;

  console.log(browserName);
  console.log(fullBrowserVersion);
  next();
}

app.all('*', checkBrowser);

We get the user-agent header with req.headers['user-agent'].

Then we use the UAParser constructor to parse the user agent string.

We call getBrowser to get the browser data.

name has the browser name and version has the version.

Best Way to Store Database Config in an Express App

We can store the config in a configuration file.

For instance, we can write:

const fs = require('fs');
const configPath = './config.json';
const configFile = fs.readFileSync(configPath, 'utf-8')
const parsedConfig = JSON.parse(configFile);
exports.storageConfig = parsedConfig;

We called readFileSync to read config.json .

Then we parse the data from the JSON string.

And we export the config in the last line.
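
For reference, a minimal config.json for this might look like the following; all of the fields are placeholders:

{
  "host": "localhost",
  "port": 5432,
  "database": "app_db",
  "user": "app_user"
}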

Promisify Node’s child_process.exec and child_process.execFile Functions

We can convert the child_process exec and execFile methods into promise-returning functions with util.promisify (a library like Bluebird also works).

For instance, we can write:

const util = require('util');
const exec = util.promisify(require('child_process').exec);

const listFiles = async () => {
  try {
    const { stdout, stderr } = await exec('ls');
    console.log('stdout:', stdout);
    console.log('stderr:', stderr);
  } catch (e) {
    console.error(e);
  }
}

listFiles();

We used the util library’s promisify method to convert exec into a promise-returning function.

Then we can call the promisified exec method with the ls command.

And then we get the full output with stdout and stderr .

stdout has the results. stderr has the error output.

Conclusion

We can convert a buffer into a readable stream.

We can scrape web pages with an HTTP client and cheerio.

crypto-js has the MD5 method to create an MD5 hash.

Methods from the child_process can be converted to promises.

We can parse the user agent string on the server side to get the browser name and version.