Categories
Node.js Tips

Node.js Tips — Async Functions, Read Files from S3, and Closing Servers

As with any kind of app, there are difficult issues to solve when we write Node apps.

In this article, we’ll look at some solutions to common problems when writing Node apps.

Avoid the async/await Inside of a new Promise() Constructor Antipattern

We should avoid the async and await inside of the Promise constructor antipattern.

For instance, instead of writing:

const createPromise = () => {
  return new Promise(async (resolve, reject) => {
    const val = await Promise.resolve(100);
    //..
  });
}

We shouldn’t have an async function inside the Promise constructor, because the constructor is only needed to wrap code that doesn’t already produce a promise.

async functions already return promises, so we don’t need the Promise constructor to turn the code into a promise.

We just move the async function out of the Promise and use that as is.
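For instance, the example above can be sketched without the Promise constructor (assuming the function’s only job is to produce the value):

```javascript
// An async function already returns a promise, so the Promise
// constructor can be dropped entirely.
const createPromise = async () => {
  const val = await Promise.resolve(100);
  // ...
  return val;
};
```

Callers still get a promise back, exactly as before, but without the extra wrapping.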

Update All Clients Using Socket.io

We can update all clients with socket.io by using the io.sockets.emit method, which emits to every connected socket.

For instance, we can write:

io.sockets.emit('message', 'hello');

We emit the message event to all clients with the content 'hello'.

And we can use the broadcast property to do the same thing.

It sends a message to everyone except for the socket that’s sending the event:

socket.broadcast.emit('message', 'hello');

Get Response from S3 getObject in Node.js

We can call the getObject method to get the item we want as follows:

const aws = require('aws-sdk');
const s3 = new aws.S3();

const getParams = {
  Bucket: 'abc',
  Key: 'abc.txt'
}

s3.getObject(getParams, (err, data) => {
  if (err) {
    return err;
  }

  const objectData = data.Body.toString('utf-8');
});

We use the aws.S3 constructor to create a new client.

Then we set the parameters for setting the bucket and key, which is the file path.

Next, we call getObject to get the file with the given path in that bucket.

err is the error object.

data is the content of the file.

We convert it to a string with toString and the correct encoding.

Properly Close the Express Server

We call close on the http server instance rather than the app instance.

For instance, we can write:

app.get('/foo', (req, res) => {
  res.redirect('/');
  setTimeout(() => {
    server.close();
  }, 3000)
});

const server = app.listen(3000);

We call close on server , which is the http server instance.

app.listen returns the http server instance.

Calling a JSON API with Node.js

We can use the https module’s get method to make a GET request to a server (the URL below uses the https protocol, so the http module’s get would reject it).

For instance, we can write:

const https = require('https');

const url = 'https://api.agify.io/?name=michael';

https.get(url, (res) => {
  let body = '';
  res.on('data', (chunk) => {
    body += chunk;
  });

  res.on('end', () => {
    const response = JSON.parse(body);
    console.log(response.name);
  });
})
.on('error', (e) => {
  console.log(e);
});

We called get by passing in a callback.

The callback listens to the data event, which gets the data.

We listen to the end event to parse the JSON when the stream is done emitting data.

In the end callback, we parse the JSON with JSON.parse .

We also listen to the error event to log any errors if they exist.

Selecting Fields for Documents Populated in MongoDB

We can call the populate method to populate the field that we want and return it.

For instance, we can write:

Model
.findOne({ _id: '...' })
.populate('age', 'name')

We get the item by its _id , populate the age field, and select only the name field of the populated documents.

Check if a Function is async

We can check if a function is async by using the Symbol.toStringTag property.

For instance, we can write:

asyncFn[Symbol.toStringTag] === 'AsyncFunction'

We can also check if it’s an instance of the AsyncFunction with instanceof .

For instance, we can write:

const AsyncFunction = (async () => {}).constructor;
const isAsyncFunc = asyncFn instanceof AsyncFunction;

We get the constructor of the async function with the constructor property.

Then we use the instanceof operator to do the check.
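Both checks can be combined into a small helper (a sketch; note that the instanceof check fails across realms such as vm contexts, while the Symbol.toStringTag check does not):

```javascript
const AsyncFunction = (async () => {}).constructor;

// A small helper combining both checks; non-functions are rejected first.
const isAsync = (fn) =>
  typeof fn === 'function' &&
  (fn instanceof AsyncFunction || fn[Symbol.toStringTag] === 'AsyncFunction');

console.log(isAsync(async () => {})); // true
console.log(isAsync(() => {}));       // false
```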

Conclusion

We shouldn’t put async functions in the Promise constructor.

We can read a file from the S3 bucket with the getObject method.

Also, we can make HTTP requests with the http module.

We can call close on the http server instance returned by app.listen .

There are a few ways to check if a function is async.


Node.js Tips — XML, MySQL, HTTP Requests, and Deleting Files


Node.JS async.parallel

The async module has the parallel method to let us run async code in parallel.

For instance, we can write:

async.parallel({
  one(callback) {
    callback(null, 'foo');
  },
  two(callback) {
    callback(null, 'bar');
  }
}, (err, results) => {
  //...
});

We have 2 methods that take a callback in the object that we pass into async.parallel .

They both call callback with the error and result objects as the arguments.

Then we get both in the results object.

results would be:

{ one: 'foo', two: 'bar' }

The method names are used as the keys of the results.
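If we don’t want to pull in the async library, the same result can be sketched with built-in promises (runParallel is a hypothetical helper, not part of the async API):

```javascript
// Run every task in the map concurrently and key the results by task name.
const runParallel = async (taskMap) => {
  const entries = await Promise.all(
    Object.entries(taskMap).map(async ([name, task]) => [name, await task()])
  );
  return Object.fromEntries(entries);
};

runParallel({
  one: async () => 'foo',
  two: async () => 'bar',
}).then((results) => {
  console.log(results); // { one: 'foo', two: 'bar' }
});
```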

Reading XML File in Node.js

We can read XML files in Node.js with the readFile method.

Then we can parse the content with the xml2json library.

For instance, we can write:

const fs = require('fs');
const parser = require('xml2json');

fs.readFile('./data.xml', (err, data) => {
  const json = parser.toJson(data);
  console.log(json);
});

We call readFile with the path of the XML file.

Then we get the XML text with data .

Next we call parser.toJson to parse the XML string stored in data .

Node MySQL Escape LIKE Statement

We can escape the characters in a LIKE statement by writing:

mysql.format("SELECT * FROM persons WHERE name LIKE CONCAT('%', ?,  '%')", searchString)

We just concatenate the % and our searchString together.
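Note that mysql.format escapes quoting, but % and _ inside searchString still act as LIKE wildcards. If we want them matched literally, a small helper can escape them first (escapeLike is a hypothetical helper, and the backslash must match the escape character used by the LIKE clause):

```javascript
// Escape LIKE wildcards (% and _) and the escape character itself
// so the search string matches literally.
const escapeLike = (str) => str.replace(/[\\%_]/g, (c) => `\\${c}`);

console.log(escapeLike('50%_off')); // 50\%\_off
```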

Delete Files Older than an Hour

To get all the files older than an hour and delete them, we can use readdir to get the files in the directory.

Then we can use fs.stat to get the file’s timestamps, which we can compare with the current time to see if an hour has passed since the file was created.

If the file was created an hour or more ago, we can use rimraf to delete it.

For instance, we can write:

const fs = require('fs');
const path = require('path');
const rimraf = require('rimraf');

const uploadsDir = path.join(__dirname, '/uploads');

fs.readdir(uploadsDir, (err, files) => {
  files.forEach((file, index) => {
    fs.stat(path.join(uploadsDir, file), (err, stat) => {
      let endTime, now;
      if (err) {
        return console.error(err);
      }
      now = new Date().getTime();
      endTime = new Date(stat.ctime).getTime() + 3600000;
      if (now > endTime) {
        return rimraf(path.join(uploadsDir, file), (err) => {
          if (err) {
            return console.error(err);
          }
          console.log('successfully deleted');
        });
      }
    });
  });
});

We read the folder with readdir .

Then we loop through the files obtained in the callback.

Then we call fs.stat to get the file information.

We use the ctime property, which holds the time the file’s status last changed (a close approximation of its creation time in this scenario).

Then we use getTime to turn it into a timestamp.

And we add 3600000 which is an hour in milliseconds.

Then if now > endTime is true , we know that more than an hour has passed since the file was created.

Then we can use rimraf to remove the file.

We use the full path.

Calling a Web Service using Node.js

We can call a web service in a Node app by making HTTP requests as we do on the client-side.

For instance, we can write:

const http = require('http');
const data = JSON.stringify({
  'id': '2'
});

const options = {
  host: 'host.com',
  port: '80',
  path: '/some/url',
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Content-Length': Buffer.byteLength(data)
  }
};

const req = http.request(options, (res) => {
  let msg = '';

  res.setEncoding('utf8');
  res.on('data', (chunk) => {
    msg += chunk;
  });
  res.on('end', () => {
    console.log(JSON.parse(msg));
  });
});

req.write(data);
req.end();

We use the http.request method to make a request.

We call req.write to send the request body.

The options object has the headers, hostname, and request method.

When the response arrives, the callback is called and res has the stream with the response.

We listen to the data event to get the chunks of data and concatenate them onto the msg string.

Then we listen to the end event to parse the chunks that are retrieved.

Conclusion

async.parallel can run callbacks in parallel.

We can delete files with rimraf.

Also, we can make HTTP requests with the http module.

To parse XML to JSON we can use the xml2json package.


Node.js Tips — Mongoose Relationships, Body Parser, and Reading and Writing Files


Disable Express BodyParser for File Uploads

We can bypass body-parser for file uploads by replacing its bodyParser.parse['multipart/form-data'] parser with a function that does nothing.

For instance, we can write:

const express = require('express');
const bodyParser = express.bodyParser;

bodyParser.parse['multipart/form-data'] = (req, options, next) => {
  next();
}

We set the bodyParser.parse['multipart/form-data'] parser to a function that does nothing and just calls next .

Then no parsing will be done when multipart/form-data data types are encountered.

A better way is to not use bodyParser directly.

Instead, we can just use it to parse the data type we want by writing:

const bodyParser = require("body-parser");
app.use(bodyParser.json());
app.use(bodyParser.urlencoded());

We call the json and urlencoded methods to only parse JSON and URL encoded data.

Node.js — Creating Relationships with Mongoose

We can create relationships with Mongoose by using the ref property.

For instance, we can write:

const Schema = mongoose.Schema;
const ObjectId = Schema.ObjectId;

const PersonSchema = new Schema({
  name: String
});

const AddressSchema = new Schema({
  street: String,
  person: { type: ObjectId, ref: 'Person' }
});

const Person = mongoose.model('Person', PersonSchema);
const Address = mongoose.model('Address', AddressSchema);

In AddressSchema , we set the ref property to 'Person', the name of the model that the person field references.

Then we can use it by writing:

const person = new Person({ name: 'james' });
person.save();

const address = new Address({
  street: '123 A St.',
  person: person._id
});
address.save();

We create the person and address , which is related to person .

The person._id is the object ID for the person, which we want to reference in address .

Then we can get the related person when we query the Address model by writing:

Address
  .findOne({})
  .populate('person')
  .exec((err, address) => {
    //...
  })

We query the Address model with findOne , then call populate to fill in the referenced person document.

Looping Through Files in a Folder in Node.js

We can loop through files in a folder in Node apps by using the opendir method.

For instance, we can write:

const fs = require('fs')

const listFiles = async (path) => {
  const dir = await fs.promises.opendir(path);
  for await (const dirent of dir) {
    console.log(dirent.name);
  }
}

listFiles('.').catch(console.error)

We use the promise version of the opendir method.

The path is the path to the folder.

Then we use the for-await-of loop to read each entry’s name property and log it.

Since an async function returns a promise, we can use catch to handle errors when we read the directory’s entries.

We can also read a directory’s entries synchronously.

For instance, we can write:

const fs = require('fs');

const dir = fs.opendirSync('.');
let dirent;
while ((dirent = dir.readSync()) !== null) {
  console.log(dirent.name)
}
dir.closeSync();

We call opendirSync to open the directory at the given path.

Then we use the while loop to read each entry with readSync .

Then we close the handle with closeSync .

Checking if writeFileSync Successfully Wrote the File

writeFileSync doesn’t return anything, but it throws an exception if the write fails, so we can wrap it in a try/catch to detect failure.

Alternatively, we can use the async version, which takes a callback that brings us the error if there’s an error.

For instance, we can write:

fs.exists(file, (exists) => {
  if (exists) {
    fs.writeFile(file, content, 'utf-8', (err) => {
      if (err) {
        console.log("failed to save");
      } else {
        console.log("success");
      }
    });
  } else {
    console.log('file not found');
  }
});

We check if the file exists. If it does, then we write what we want to it.

file is the file path. content is the content that we want to write.

'utf-8' is the encoding.

err is the error that’s encountered when the file failed to save.

Otherwise, we don’t do anything.

Conclusion

We can use parts of the body-parser package to parse the request payload we want.

Mongoose can create schemas with relationships.

We can loop through folders with various methods.

Also, we can check if a file exists before we write to them.

writeFileSync throws an exception when a write fails, while writeFile reports errors through its callback.


Node.js Tips — Send Message, Adding End of Line, and S3 Uploads


Send a Message to a Particular Client with Socket.io

We can send a message to a particular client with socket.io by using the emit method.

For instance, we can write:

io.to(socket.id).emit("event", data);

We call the io.to method with the ID of the client we want to send our message to.

Responding with a JSON object in Node.js

We can use the res.json method with Express to return a JSON response.

For instance, we can write:

res.json({ foo: 'bar' });

to return the JSON object that we passed in.

Check if Node.js is Installed or Not

We can check if Node is installed by using the node -v command to display the Node version.

Select and Update Document by _id with node-mongodb-native

To select and update a document by its _id , we can use the mongo.ObjectID constructor to create the ID object.

For instance, we can write:

const mongo = require('mongodb');
const id = new mongo.ObjectID(theId);
collection.updateOne({ _id: id }, { $set: { /* fields to update */ } });

theId is the string of the ID.

Then we use that as the value of the '_id' property.

How to Append to New Line in Node.js

We can append a new line to a file in a Node app by using the open and write methods.

To add the new line, we use the os.EOL property to add the new line character.

This way, the new line will be inserted correctly on all platforms.

For instance, we can write:

const fs = require("fs");
const os = require("os");

const addNewLine = (text) => {
  fs.open('/path/to/file', 'a', 0o666, (e, id) => {
    fs.write(id, `${text}${os.EOL}`, null, 'utf8', () => {
      fs.close(id, () => {
       console.log('file is updated');
      });
    });
  });
}

We created the addNewLine method to open the file.

We open it with append permission as indicated by 'a' .

666 (written 0o666 in modern Node, since it’s an octal value) is the file permission, which grants read and write access.

Then we get the file descriptor in the callback so we can use it to update the file.

In the fs.write call, id is the file descriptor. The 2nd argument is the text content.

os.EOL is the platform-agnostic new line constant.

'utf8' is the UTF-8 encoding.

Then the callback is called when writing is done.

Then we clean up the file descriptor with fs.close .

Uploading base64 Encoded Image to Amazon S3 via Node.js

To upload a base64 image to S3, we can convert it to a buffer.

Then we can call putObject to upload the file.

For instance, we can write:

const AWS = require('aws-sdk');
AWS.config.loadFromPath('./config.json');
const s3Bucket = new AWS.S3( { params: { Bucket: 'someBucket'} } );

const buffer = Buffer.from(req.body.imageBinary.replace(/^data:image\/\w+;base64,/, ""), 'base64');

const data = {
  Key: '123',
  Body: buffer,
  ContentEncoding: 'base64',
  ContentType: 'image/jpeg'
};

s3Bucket.putObject(data, (err, data) => {
  if (err) {
    console.log(err, data);
  } else {
    console.log('success');
  }
});

We first get the credentials from the config.json file to authenticate the AWS SDK.

Then we create a buffer from the base64 string.

We have to strip the leading data URL prefix (such as data:image/jpeg;base64, ) first.

Then we create a data object with the Key , Body , ContentEncoding and ContentType properties.

Key is the path to the file.

Body has the content of the file.

ContentEncoding has the encoding.

ContentType has the data type.

Then we call putObject to upload the file.

config.json has:

{
  "accessKeyId":"xxxxxxxxxxxxxxxx",
  "secretAccessKey":"xxxxxxxxxxxxxx",
  "region":"us-east-1"
}

Stdout Buffer Issue Using Node child_process’s exec — maxBuffer Exceeded

If we get the ‘maxBuffer exceeded’ error, we can set the maxBuffer option on exec to increase the buffer size.

For instance, we can write:

exec('ls', { maxBuffer: 1024 ** 2 }, (error, stdout, stderr) => {
  console.log(error, stdout);
});

We set the maxBuffer size in the object in the 2nd argument.

Conclusion

We can use the io.to method to send the message to a given client.

To send JSON response, we can use res.json .

We can update with ID.

The max buffer size of exec can be increased.

We use the os.EOL constant to add a platform-agnostic end of line character.


Node.js Tips — MongoDB Connections, Fetch, and Exports


Retrieve Data from a ReadableStream Object Returned From the Fetch API

We can retrieve data from a ReadableStream object by calling conversion functions to convert it to the data type we want.

For instance, we write:

fetch('https://api.agify.io/?name=michael')
  .then(response => response.json())
  .then((data) => {
    console.log(data);
  });

We did the conversion to JSON with the response.json() call.

Then we can get the data in the next then callback.

Share Variables Between Files in Node.js

We can share variables by creating modules.

For instance, we can write:

module.js

const name = "foo";
exports.name = name;

app.js

const mod = require('./module');
const name = mod.name;

We export the name constant by setting it as a property of exports .

Then we import it with require in app.js .

And we can use it like any other property.

Using fs in Node.js with async / await

We can use async and await with fs methods if we convert them to promises first.

For instance, we can write:

const fs = require('fs');
const util = require('util');

const readdir = util.promisify(fs.readdir);

const read = async () => {
  try {
    const names = await readdir('/foo/bar/dir');
    console.log(names);
  } catch (err) {
    console.log(err);
  }
}

We convert the fs.readdir method, which reads a directory, to a promise version with util.promisify .

Then we can use it with async and await .

Mongoose and Multiple Databases in a Single Node.js Project

We can connect to multiple MongoDB databases in a single Node project.

For instance, we can write:

fooDbConnect.js

const mongoose = require('mongoose');
module.exports = mongoose.createConnection('mongodb://localhost/foo');

bazDbConnect.js

const mongoose = require('mongoose');
module.exports = mongoose.createConnection('mongodb://localhost/baz');

We create a separate connection with createConnection in each file and export it, since the default mongoose instance can only connect to one database at a time.

Then in db.js , we write:

const fooDb = require('./fooDbConnect');

to get the connection to the foo database.

Use of module.exports as a Constructor

We can export a constructor with module.exports .

For example, we can write:

person.js

function Person(name) {
  this.name = name;
}

Person.prototype.greet = function() {
  console.log(this.name);
};

module.exports = Person;

Then we can import it by writing:

app.js

const Person = require("./person");
const person = new Person("james");

We import the Person constructor in app.js and then invoke it with new in the last line.
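The same constructor can also be written as an ES2015 class and exported the same way (an equivalent sketch):

```javascript
// An ES2015 class version of the Person constructor; module.exports
// works identically for classes and functions.
class Person {
  constructor(name) {
    this.name = name;
  }

  greet() {
    console.log(this.name);
  }
}

module.exports = Person;
```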

Read a Text File Using Node.js

We can read a text file with the fs.readFile method.

For instance, we can write:

const fs = require('fs');

fs.readFile('./foo.txt', 'utf8', (err, data) => {
  if (err) {
    console.log(err);
  }
  console.log(data);
});

We pass in the file path as the first argument.

The encoding is the 2nd argument.

The callback that’s called when the file read operation is done is the 3rd argument.

data has the file contents and err has the error object if an error is encountered.

Reuse Connection to MongoDB Across Node Modules

We can export the MongoDB connection so that we can use them in other Node app modules.

For example, we can write:

dbConnection.js

const MongoClient = require( 'mongodb' ).MongoClient;
const url = "mongodb://localhost:27017";

let _db;

module.exports = {
  connect(callback) {
    MongoClient.connect( url, { useNewUrlParser: true }, (err, client) => {
      _db  = client.db('some_db');
      return callback(err, client);
    });
  },

  getDb() {
    return _db;
  }
};

We created the connect function to connect to the MongoDB database.

The connection object is set to a variable.

It takes a callback that we can use to get any errors if there are any.

Then we created the getDb function to get the MongoDB connection.

Then we can use dbConnection.js by writing:

const dbConnection = require('./dbConnection');

dbConnection.connect((err, client) => {
  if (err) {
    console.log(err);
  }
} );

We connect to the database in our file by importing the dbConnection module we created earlier.

Then we call connect with the callback we want.

We called it with err and client in dbConnection.js , so those are the parameters of the callback.

Conclusion

MongoDB connections can be shared between modules.

We have to convert the raw response from the fetch function to the data type we want before we can use the data.

fs methods can be converted to promises.

We can export constructors with module.exports.