
Node.js FS Module — Opening Files

Manipulating files is a basic operation for any program. Since Node.js is a server-side platform and can interact with the computer that it’s running on directly, being able to manipulate files is a basic feature. Fortunately, Node.js has a fs module built into its library. It has many functions that can help with manipulating files and folders.

Supported file and folder operations include basic ones like opening and manipulating files and directories. It can do this both synchronously and asynchronously, and its asynchronous API includes functions that support promises. It can also show statistics for a file.

Almost all the file operations that we can think of can be done with the built-in fs module. In this article, we will introduce the fs module and how to construct paths that can be used to open files and open files with it.

We will also experiment with the fs promises API to do equivalent operations where they exist. The fs promises API functions return promises, so this lets us run asynchronous operations sequentially much more easily.

To use the fs module, we just have to require it like in the following line of code:

const fs = require('fs');

Asynchronous Operations

The asynchronous versions of the file functions take a callback whose arguments are an error and the result of the operation. If the operation completes successfully, then null or undefined is passed in for the error argument, which should be the first parameter of the callback function. Asynchronous operations aren’t executed sequentially. For example, if we want to delete a file called file, we can write:

const fs = require('fs');  
  
fs.unlink('/file', (err) => {  
  if (err) throw err;  
  console.log('successfully deleted file');  
});

We have the err parameter, which holds the error data if an error occurred. Since asynchronous operations aren’t run sequentially, to do multiple operations one after another we can either convert them to promises or nest each operation in the callback of the previous one. For example, if we want to rename a file and then check its stats, we shouldn’t write the following:

fs.rename('/file1', '/file2', (err) => {  
  if (err) throw err;  
  console.log('renamed complete');  
});  
fs.stat('/file2', (err, stats) => {  
  if (err) throw err;  
  console.log(stats);  
});

because they aren’t guaranteed to be run sequentially. Therefore, we should instead write:

fs.rename('./file1', './file2', (err) => {  
  if (err) throw err;  
  console.log('renamed complete');  

  fs.stat('./file2', (err, stats) => {  
    if (err) throw err;  
    console.log(stats);  
  });  
});

However, this gets messy if we want to do lots of operations sequentially, since each file operation adds another level of nesting. Too much nesting of callback functions is called callback hell, and it makes the code very hard to read and debug. Therefore, we should use something like promises to run asynchronous operations. For example, we should rewrite the example above like the following code:

const fsPromises = require("fs").promises;

(async () => {  
  try {  
    await fsPromises.rename("./files/file1.txt", "./files/file2.txt");  
    const stats = await fsPromises.stat("./files/file2.txt");  
    console.log(stats);  
  } catch (error) {  
    console.log(error);  
  }  
})();

We used the fs promises API, whose file operation functions return promises. This is much cleaner and takes advantage of the async and await syntax for chaining promises. Note that at the time of writing, the fs promises API carried a warning that it’s experimental.

However, it has been quite stable so far and it’s good for basic file manipulation that requires chaining of multiple file operations. Note that we are catching errors with the try...catch block. async functions look like synchronous functions but always return promises.

Each of the examples above will output something like the following:

Stats {  
  dev: 3605029386,  
  mode: 33206,  
  nlink: 1,  
  uid: 0,  
  gid: 0,  
  rdev: 0,  
  blksize: undefined,  
  ino: 6192449489177455,  
  size: 0,  
  blocks: undefined,  
  atimeMs: 1572568634188,  
  mtimeMs: 1572568838068,  
  ctimeMs: 1572569087450.1968,  
  birthtimeMs: 1572568634187.734,  
  atime: 2019-11-01T00:37:14.188Z,  
  mtime: 2019-11-01T00:40:38.068Z,  
  ctime: 2019-11-01T00:44:47.450Z,  
  birthtime: 2019-11-01T00:37:14.188Z }

Synchronous Operations

Synchronous file operations usually have the word Sync at the end of their names, and they run one after another, line by line. We catch errors with the try...catch block. If we rewrite the earlier unlink example as a synchronous operation, we can write:

const fs = require('fs');  
  
try {  
  fs.unlinkSync('./files/file1.txt');  
  console.log('successfully deleted ./files/file1.txt');  
} catch (err) {  
  console.error(err);  
}

We get the err binding in the catch clause to access the error data. The issue with synchronous operations in Node.js is that they block the event loop until they finish, so long, resource-intensive operations stall the whole program.

File Paths

Most file operation functions accept a path in the form of a string, a Buffer, or a URL object using the file: protocol. String paths are interpreted as UTF-8 character sequences that identify the absolute or relative path of the file or folder.

Relative paths are resolved relative to the current working directory, which is whatever is returned by the process.cwd() function. For example, we can open a file with a relative path like in the following code:

const fs = require("fs");

fs.open("./files/file.txt", "r", (err, fd) => {  
  if (err) throw err;  
  fs.close(fd, err => {  
    if (err) throw err;  
  });  
});

The dot before the first slash refers to the current working directory in a relative path. Using the promise API, we can rewrite it as the following code:

const fsPromises = require("fs").promises;

(async () => {  
  try {  
    const fileHandle = await fsPromises.open("./files/file.txt", "r");  
    console.log(fileHandle);  
  } catch (error) {  
    console.log(error);  
  }  
})();

The regular fs API puts the file access flag as the second argument; r stands for read-only. Its open function passes fd, the file descriptor, to the callback; fd is a numeric reference to the opened file, and we close the file by passing fd to the close function.

The fs promises API’s open function instead resolves with a FileHandle object, which wraps the file descriptor, and in recent Node.js versions the handle exposes its own close method for releasing the file.

The fs API also accepts a URL object as a reference to the file location. It has to have the file: protocol and they must be absolute paths. For example, we can create a URL object and pass it into the read function to read a file. We can write the following code to do this:

const fs = require("fs");  
const fileUrl = new URL(`file://${__dirname}/files/file.txt`);

fs.open(fileUrl, "r", (err, fd) => {  
  if (err) throw err;  
  fs.close(fd, err => {  
    if (err) throw err;  
  });  
});

It does exactly the same thing as using a relative path. It’s just slightly more complex, since we have to build the URL object from the directory of the running script, which we get from the __dirname variable. On Windows, any file: URL with a hostname prepended to the path is interpreted as a UNC path, which is the path used to access a file over the local area network or Internet. For example, if we have the following:

fs.readFileSync(new URL('file://hostname/path/to/file'));

Then that will be interpreted as accessing a file on the server with the hostname hostname .

Any file URL that has a drive letter will be interpreted as an absolute path on Windows. For example, if we have the following path:

fs.readFileSync(new URL('file://c:/path/to/file'));

Then it’ll try to access the file at the c:\path\to\file path. On Windows, a file: URL without a hostname must include a drive letter, and a path that starts with a drive letter must have a colon after it. Therefore, only file paths in the format above are valid in a URL object on Windows. On all other platforms, file: URLs with a hostname are unsupported, and a call like fs.readFileSync(new URL('file://hostname/path/to/file')); will throw an error.

Any file: URL containing an encoded slash character (%2F) will throw an error on all platforms. On Windows, these examples would throw errors:

fs.readFileSync(new URL('file:///C:/path/%2F'));  
fs.readFileSync(new URL('file:///C:/path/%2f'));

And on POSIX systems, these would fail:

fs.readFileSync(new URL('file:///pathh/%2F'));  
fs.readFileSync(new URL('file:///path/%2f'));

On Windows, file URLs with the encoded backslash character will throw an error, so the following examples are invalid:

fs.readFileSync(new URL('file:///D:/path/%5C'));  
fs.readFileSync(new URL('file:///D:/path/%5c'));

On POSIX systems the kernel maintains a list of opened files and resources for every process. Each file that’s opened is assigned a simple numeric identifier called the file descriptor.

The operating system uses the file descriptor to identify and track each specific file. Windows uses a similar mechanism, and file descriptors are likewise used for tracking files and resources opened by various processes.

Node.js does the hard work of assigning file descriptors to the resources so we don’t have to do it manually. This is handy for cleaning up resources that are opened.

In Node.js, the fs.open() function is used to open files and assign a new file descriptor to opened files. After processing is done, it can be closed by the close function so that the open resources can be closed and cleaned up. This can be used like in the following code:

const fs = require("fs");

fs.open("./files/file.txt", "r", (err, fd) => {  
  if (err) throw err;  
  console.log(fd);  
  fs.fstat(fd, (err, stat) => {  
    if (err) throw err;  
    console.log("Stat", stat);  
    fs.close(fd, err => {  
      if (err) throw err;  
      console.log('Closed');  
    });  
  });  
});

If we run the code above, we get output that resembles the following:

3  
Stat Stats {  
  dev: 3605029386,  
  mode: 33206,  
  nlink: 1,  
  uid: 0,  
  gid: 0,  
  rdev: 0,  
  blksize: undefined,  
  ino: 22799473115106240,  
  size: 0,  
  blocks: undefined,  
  atimeMs: 1572569358035.625,  
  mtimeMs: 1572569358035.625,  
  ctimeMs: 1572569358035.625,  
  birthtimeMs: 1572569358035.625,  
  atime: 2019-11-01T00:49:18.036Z,  
  mtime: 2019-11-01T00:49:18.036Z,  
  ctime: 2019-11-01T00:49:18.036Z,  
  birthtime: 2019-11-01T00:49:18.036Z }  
Closed

In the code above, we opened the file with the open function, which provides the file descriptor fd in its callback; we can use it to get information about the file with the fstat function. After we’re done, we close the opened file resource with the close function.

We did the file open operation, read the file metadata, and then closed the file with the close function.

With the fs promises API, the equivalent functionality lives on the FileHandle object, which in recent Node.js versions has its own stat and close methods.

The Node.js runtime has a fs module built into its standard library, with many functions for manipulating files and folders, both synchronously and asynchronously, including a promise-based API and functions for showing statistics for a file.

In this article, we introduced the fs module, looked at how to construct paths, and used both the callback and promise APIs to open files. The fs promises API functions return promises, which lets us run asynchronous operations sequentially much more easily. We barely scratched the surface, so stay tuned for Part 2 of this series.

Storing Sessions in Express Apps with cookie-session

To store sessions in our Express apps, we can use the cookie-session middleware.

It’s a cookie-based session middleware that stores user sessions on the client within a cookie.

In this article, we’ll look at how to store sessions within an Express app with the cookie-session middleware.

How cookie-session Stores Sessions

cookie-session doesn’t require any database or other server-side resources to store sessions. The session data can’t exceed the browser’s max cookie size.

It can also simplify certain load-balanced scenarios. Another pattern is to store a light session containing just an identifier and use it to look up the rest in a database-backed secondary store, reducing database lookups.

Adding the Library

We can install cookie-session by running:

npm i cookie-session

Then we can use it as follows:

const cookieSession = require('cookie-session')  
const express = require('express')  
  
const app = express()  
  
app.use(cookieSession({  
  name: 'session',  
  keys: ['secret'],  
  maxAge: 24 * 60 * 60 * 1000  
}))

Options

The cookie-session middleware lets us pass in an options object with various properties to configure the session cookie.

This middleware will automatically add a Set-Cookie header to the response if the contents of the req.session were altered.

The Set-Cookie header won’t be in the response unless there are contents in the session, so we should add something to req.session as soon as we have identifying information to store the session.

The following options are available:

name

Name of the cookie to be set and defaults to session .

keys

The list of keys to sign and verify cookie values, or a Keygrip instance. Set cookies are always signed with keys[0] , and other keys are valid for verification.

This allows for key rotation. It can be used to change signature parameters like the algorithm of the signature.

secret

The string will be used as a single key if keys isn’t provided.

Cookie Options

The cookie object within the options has the following properties:

  • maxAge — a number of milliseconds from Date.now() for expiry
  • expires — a Date object indicating the cookie’s expiration date
  • path — a string indicating the path of the cookie, which defaults to /
  • domain — a string indicating the domain of the cookie
  • sameSite — boolean or string indicating whether the cookie is a ‘same site’ cookie. The default value is false ; 'strict' , 'lax' , or true are other possible values
  • secure — boolean indicating whether the cookie is only sent over HTTPS. If this is true and communication isn’t through TLS, then the cookie may not be set correctly
  • httpOnly — boolean indicating whether the cookie is sent over HTTP(S) and not available to client JavaScript
  • signed — boolean indicating whether the cookie is signed. Default value is true . If it’s true , then the cookie is appended with a .sig suffix.
  • overwrite — boolean indicating whether to overwrite previously set cookies of the same name

req.session

req.session holds the session for the given request.

.isChanged

A property that is true if the session has been changed during the request.

.isNew

A property that is true if the session is new.

.isPopulated

A property that is true if the session has been populated with data.

req.sessionOptions

We can set the sessionOptions object to change cookie settings.

Destroying a Session

We can set req.session to null to destroy a session.

Example

We can track the number of views a user made as follows:

const express = require('express');  
const bodyParser = require('body-parser');  
const cookieSession = require('cookie-session');  
const app = express();  
app.set('trust proxy', 1);  
app.use(cookieSession({  
  name: 'session',  
  keys: ['key1', 'key2']  
}))
app.get('/', (req, res, next) => {  
  req.session.numViews = (req.session.numViews || 0) + 1  
  res.end(`${req.session.numViews} views`);  
})
app.listen(3000);

In the code above, we just set req.session with the properties that we want to add and set the values the way we want.

The numViews property is added to track the number of views and increments as more requests to / are made.

Setting Max Age

We can set the maxAge as follows:

const express = require('express');  
const bodyParser = require('body-parser');  
const cookieSession = require('cookie-session');  
const app = express();  
app.set('trust proxy', 1);  
app.use(cookieSession({  
  name: 'session',  
  keys: ['key1', 'key2'],  
  maxAge: 1000,  
}))
app.use((req, res, next) => {  
  req.sessionOptions.maxAge = req.session.maxAge || req.sessionOptions.maxAge  
  next()  
})
app.get('/', (req, res, next) => {  
  req.session.numViews = (req.session.numViews || 0) + 1  
  res.end(`${req.session.numViews} views`);  
})
app.listen(3000);

We set maxAge to 1000 milliseconds above, so we’ll see the view count reset to 1 once the session expires.

In both cases, the Set-Cookie response header will be set as the session is set for the client.

We have multiple keys: the first is used to sign new cookies, and the others remain valid for verification.

Keygrip Integration

We can use keygrip as follows for signing our sessions:

const express = require('express');  
const bodyParser = require('body-parser');  
const cookieSession = require('cookie-session');  
const Keygrip = require('keygrip');  
const app = express();  
app.set('trust proxy', 1);  
app.use(cookieSession({  
  name: 'session',  
  keys: new Keygrip(['key1', 'key2'], 'SHA384'),  
  maxAge: 60 * 60 * 1000,  
}))
app.use((req, res, next) => {  
  req.sessionOptions.maxAge = req.session.maxAge || req.sessionOptions.maxAge  
  next()  
})
app.get('/', (req, res, next) => {  
  req.session.numViews = (req.session.numViews || 0) + 1  
  res.end(`${req.session.numViews} views`);  
})
app.listen(3000);

In the code above, we set the signature hash algorithm to SHA384.

Conclusion

The cookie-session middleware lets us store sessions on the client. We can use it to store identifying information about a user, with the cookie contents signed using our keys so they can’t be tampered with.

Then we can get and set the data via req.session .

Various options are also available for sessions. We can set maxAge to set when the session expires.

We can use domain , path , sameSite , secure , and httpOnly to control what kind of domain and path to set in our cookies, and also set what sites the cookies will be set for.

cookie-session also works with Keygrip for signing.


Guide to the Express Application Object — Rendering and Setting

The core part of an Express app is the Application object. It’s the application itself.

In this article, we’ll look at the methods of the app object and what we can do with it, including rendering HTML and changing settings.

app.render(view, [locals], callback)

We can use the app.render method to render HTML of a view via its callback function. It takes an optional parameter that’s an object containing variables for the view.

For example, we can use it as follows:

const express = require('express');  
const bodyParser = require('body-parser');  
const app = express();
app.use(bodyParser.json());  
app.use(bodyParser.urlencoded({ extended: true }));
app.engine('ejs', require('ejs').renderFile);  
app.set('view engine', 'ejs');
app.render('index', { people: ['geddy', 'neil', 'alex'] }, (err, html) => {  
  console.log(html);  
});
app.listen(3000);

Then if we have the following in views/index.ejs :

<%= people.join(", "); %>

Then we get:

geddy, neil, alex

outputted from console.log(html);

app.route(path)

We can use app.route to define route handlers with the given path .

For example, we can use it as follows:

const express = require('express');  
const bodyParser = require('body-parser');  
const app = express();
app.use(bodyParser.json());  
app.use(bodyParser.urlencoded({ extended: true }));
app.route('/')  
  .get((req, res, next) => {  
    res.send('GET request called');  
  })  
  .post((req, res, next) => {  
    res.send('POST request called');  
  })  
  .all((req, res, next) => {  
    res.send('Other requests called');  
  });

app.listen(3000);

Then when a GET request is made to / , we get GET request called . If a POST request is made to / , then we get POST request called .

Any other kind of requests to / will get us Other requests called .

The order matters since all will handle all kinds of requests. So if we want to listen to specific kinds of requests in addition to other kinds of requests, all should come last.

app.set(name, value)

We can use set to set the setting with the given name to the given value .

For example, we can use it as follows:

const express = require('express');  
const bodyParser = require('body-parser');  
const app = express();
app.use(bodyParser.json());  
app.use(bodyParser.urlencoded({ extended: true }));  
app.set('title', 'Foo');
app.get('/', (req, res) => {  
  res.send(app.get('title'));  
})
app.listen(3000);

Since we called app.set('title', 'Foo'); to set the title setting to Foo , we should see Foo displayed when we make a GET request to / .

Some settings are special for Express. They include:

  • case sensitive routing — boolean value for enabling or disabling case sensitive routing (e.g. /Foo will be considered different from /foo if this is true )
  • env — string for the environment mode
  • etag — ETag response header
  • jsonp callback name — string for the JSONP callback name
  • json escape — boolean option to enable or disable escaping JSON response from res.json , res.jsonp or res.send . <, >, and & will be escaped if this is true
  • json replacer — replace callback for JSON.stringify
  • json spaces — spaces argument for JSON.stringify
  • query parser — disable query parsing if set to false , or set the query parser to use either 'simple' or 'extended' , or a custom query string parsing function.
  • strict routing — boolean setting for enabling/disabling strict routing. If this is true , then /foo will be considered different from /foo/
  • subdomain offset — number of dot-separated parts of the host to remove to access subdomain. Defaults to 2.
  • trust proxy — indicates that the app is behind a proxy if it’s true . The X-Forwarded-* headers will determine the connection and IP address of the client.
  • views — string or array of directories to look for view templates. If it’s an array, then the views will be searched in the order they’re listed
  • view cache — boolean to enable view template compilation caching
  • view engine — string for setting view engine for rendering templates.
  • x-powered-by — enable the 'X-Powered-By: Express' HTTP header

Options for `trust proxy` setting

It can take on the following options:

  • true — client’s IP address is understood to be the leftmost entry of the X-Forwarded-For header
  • false — the app is assumed to be directly facing the Internet
  • String, comma-separated strings, or array of strings — one or more subnets or IP address to trust
  • Number — trust the nth hop from the front-facing proxy server as the client.
  • Function — custom trust implementation

Options for etag setting

It can take on the following options:

  • Boolean — true enables weak ETag, false disables ETag
  • String — 'strong' enables strong ETags, 'weak' enables weak ETags
  • Function — custom implementation.

Conclusion

We can render an HTML string with the app.render method. It takes a view file name, an object for variables and a callback with the html parameter to get the HTML rendered.

app.route lets us define route handlers.

app.set lets us set the options we want for our app. Some settings are special for Express and will be processed by it if they’re set.


Node.js FS Module — Opening Directories

Manipulating files and directories are basic operations for any program. Since Node.js is a server-side platform and can interact with the computer that it’s running on directly, being able to manipulate files is a basic feature. Fortunately, Node.js has a fs module built into its library, with many functions that can help with manipulating files and folders.

Supported file and directory operations include basic ones like opening and manipulating files and directories, both synchronously and asynchronously. The module has an asynchronous API with functions that support promises, and it can also show statistics for a file. Almost all the file operations that we can think of can be done with the built-in fs module.

In this article, we will use the functions in the fs module to open and manipulate directories. We will also experiment with the fs promises API to do equivalent operations where they exist. The fs promises API functions return promises, so this lets us run asynchronous operations sequentially much more easily.

Opendir Promise

The fs module can open, manipulate, and close directories. Opening directories with the module is a simple task. To open a directory in a Node.js program, we can use the opendir function. It takes a path as an argument, which can be a string, a Buffer, or a URL object. To open a directory, we can write the following code with the fs promises API, which is available in Node.js 12 LTS or later:

const fs = require("fs");

(async (path) => {  
  const dir = await fs.promises.opendir(path);  
  for await (const dirent of dir) {  
    console.log(dirent.name);  
  }  
})('./files');

The code above will list all the files and folders that are in the given folder. If you run the code above, you should get something like the following:

.keep  
file.txt  
file2.txt

Readdir

If you’re using a Node.js version before 12, then we have to use the older fs API, which doesn’t have functions that return promises. We have to use the readdir function instead to list the contents of a directory. The readdir function takes 2 arguments. The first is the path, which can be a string, a URL object, or a Buffer, and the second is a callback function that’s called when the result is produced. The first parameter of the callback is the error object, which is defined when there’s an error, and the second is an array with the actual results. For example, if we want to view the content of the ./files folder, we can write:

const fs = require("fs");

fs.readdir("./files", (err, items) => {  
  for (const dirent of items) {  
    console.log(dirent);  
  }  
});

The code above will get us similar output if we run it. It will open the directory path asynchronously, and then list the contents when it’s ready. If we have the same content as the directory above, we will see the following:

.keep  
file.txt  
file2.txt

There’s also a synchronous version of the readdir function called readdirSync . It takes one argument, the path, which can be a string, a URL object, or a Buffer. It returns an array with the content of the given directory path. We can do the same thing by writing the following code:

const fs = require("fs");

const items = fs.readdirSync("./files");  
for (const dirent of items) {  
  console.log(dirent);  
}

If we run the code above, we get the same output as we did before if we have the same contents in the given directory as before.

The opendir function and its synchronous version, opendirSync, let us get a directory resource handle for the folder at the given path. The path can be a string, a URL object, or a Buffer. The asynchronous version, opendir , takes the path as the first argument and a callback function which takes the error object as its first parameter and the directory resource handle object as its second; it’s called when the directory is opened. With the directory handle, we can then read the directory’s entries one at a time.

Whenever we’re done with a directory opened via opendir or opendirSync and have finished manipulating it, we should close it to release its handle and avoid leaking resources. To do this, the directory handle has a close function, which takes a callback with an error parameter that reports any error; it closes the open directory’s resource handle asynchronously. There’s also a synchronous version, closeSync , which takes no arguments and closes the handle like close does, but blocks the Node.js process until it’s finished. This doesn’t apply to the fs promises version of opendir when we iterate over it, since that closes the handle automatically. For example, if we want to open a directory asynchronously, we can write:

fs.opendir("./files", (err, dir) => {  
  console.log(dir);  
  dir.read((err, dirent) => {  
    console.log(dirent);  
    dir.close();  
  });  
});

The code above will open the directory with the opendir function, which takes a path and a callback that takes an error object and the directory handle as parameters. Then we call the read function to read the directory’s next entry and get its name and type.

Then we call close on the directory resource handle in the callback passed to read , which closes the resource and frees up the memory. If we run it, we get output that looks something like the following:

Dir {  
  [Symbol(kDirHandle)]: DirHandle {},  
  [Symbol(kDirPath)]: './files',  
  [Symbol(kDirClosed)]: false,  
  [Symbol(kDirOptions)]: { encoding: 'utf8' },  
  [Symbol(kDirReadPromisified)]: [Function: bound read],  
  [Symbol(kDirClosePromisified)]: [Function: bound close]  
}  
Dirent { name: '\u0010S.u\u0001', [Symbol(type)]: 777167552 }

OpendirSync

A synchronous version of the opendir function, which is the opendirSync function, can be used with the readSync function to get the directory’s metadata. For example, we can write:

const fs = require("fs");

const dir = fs.opendirSync("./files");  
const dirData = dir.readSync();  
console.log(dirData)  
dir.closeSync();

The dirData variable will have something like the following output:

Dirent { name: '.keep', [Symbol(type)]: 1 }

That is the data for one entry of the directory. To get the path of the opened directory, we can use the path property of the directory object. For example, we can write the following to get the path of the currently opened directory:

const fs = require("fs");

fs.opendir("./files", (err, dir) => {  
  console.log(dir.path);  
});

If we run the code above, we will get output similar to the following:

./files

The output row is from the console.log statement, which prints the path of the currently open directory. This isn’t very useful here, but it helps when the directory resource handle comes from elsewhere, such as when it’s passed in as a function argument.

To read the contents of the directory with a directory resource handle, we can use the read function, which is part of the directory resource handle object.

Called with no arguments, it returns a promise that resolves with an fs.Dirent object describing the next entry in the directory, or null once every entry has been read.

For example, we can use read like in the following code with the asynchronous opendir function:

const fs = require("fs");

fs.opendir("./files", (err, dir) => {  
  dir.read((err, dirent) => {  
    console.log(dirent);  
    dir.close();  
  });  
});

If we run the code, we get something like the following output:

Dirent { name: '.keep', [Symbol(type)]: 1 }

The output above has the Dirent object, which has the name of the first file in the directory.


Listing Multiple Pieces of Directory Content with Opendir

We can read more than one entry in the directory by nesting a read call in the callback of the outer read call. For example, we can write:

const fs = require("fs");

fs.opendir("./files", (err, dir) => {  
  console.log(dir.path);  
  dir.read((err, dirent) => {  
    console.log(dirent);  
    dir.read((err, dirent) => {  
      console.log(dirent);  
      dir.close();  
    });  
  });  
});

Then we get output with something like the following text:

./files  
Dirent { name: '.keep', [Symbol(type)]: 1 }  
Dirent { name: 'file.txt', [Symbol(type)]: 1 }

The first row of the output contains the folder path, which we got with dir.path. The next 2 rows are the first 2 entries of the directory. Every time we call dir.read, we get the next entry of the directory.

Since read returns a promise when it's called without arguments, we can also use it with async and await. For example, we can use it like in the following code with the asynchronous opendir function:

const fs = require("fs");

fs.opendir("./files", async (err, dir) => {  
  console.log(dir.path);  
  const item = await dir.read();  
  console.log(item)  
  dir.close();  
});

If we run the code above, we get something like the following:

./files  
Dirent { name: '.keep', \[Symbol(type)\]: 1 }

If we want to get the first 2 items in the directory with the promise version of dir.read, we can write the following:

const fs = require("fs");

fs.opendir("./files", async (err, dir) => {  
  console.log(dir.path);  
  const item1 = await dir.read();  
  console.log(item1)  
  const item2 = await dir.read();  
  console.log(item2)  
  dir.close();  
});

We should get the same output as above, assuming the folder contents haven't changed. As we can see, calling read once per entry, as in the 2 examples above, isn't practical for listing many items. If we want to get more than 2 items in the directory with the read function, we can write the following with the promise version of the read function, which takes no arguments:

const fs = require("fs");

fs.opendir("./files", async (err, dir) => {  
  let item;  
  while ((item = await dir.read())) {  
    console.log(item);  
  }  
  dir.close();  
});

If we run the code above, we get something like the following output:

Dirent { name: '.keep', [Symbol(type)]: 1 }  
Dirent { name: 'file.txt', [Symbol(type)]: 1 }  
Dirent { name: 'file2.txt', [Symbol(type)]: 1 }  
Dirent { name: 'folder1', [Symbol(type)]: 2 }  
Dirent { name: 'folder2', [Symbol(type)]: 2 }

Note that we can use the await keyword inside a while loop to repeatedly call dir.read() until it resolves with null. In the code above, we looped through all the entries in the directory and logged the content of each Dirent object.

To check if a Dirent object is a directory, we can use the isDirectory() function. Likewise, we can check if a Dirent is a file with the isFile() function. For example, we can check whether each entry of a given directory is a file or a folder with the following code:

const fs = require("fs");

fs.opendir("./files", async (err, dir) => {  
  let item;  
  while ((item = await dir.read())) {  
    console.log(item);  
    console.log(`Is directory: ${item.isDirectory()}`);  
    console.log(`Is file: ${item.isFile()}\n`);  
  }  
  dir.close();  
});

When we run the code above, we get something like the following output:

Dirent { name: '.keep', [Symbol(type)]: 1 }  
Is directory: false  
Is file: true

Dirent { name: 'file.txt', [Symbol(type)]: 1 }  
Is directory: false  
Is file: true

Dirent { name: 'file2.txt', [Symbol(type)]: 1 }  
Is directory: false  
Is file: true

Dirent { name: 'folder1', [Symbol(type)]: 2 }  
Is directory: true  
Is file: false

Dirent { name: 'folder2', [Symbol(type)]: 2 }  
Is directory: true  
Is file: false

As we can see, the isDirectory and isFile functions identify whether an entry in a directory is another directory or a file.

With Node.js' built-in fs module, we can open directories and list their contents easily. It has many functions that can help with opening directories and the files inside them. It can do this both synchronously and asynchronously, and it has an asynchronous API with functions that support promises.

Also, it can show statistics for a file. Almost all the file operations that we can think of can be done with the built-in fs module. In this article, we used the functions in the fs module to open directories and list directory contents.

We also used some functions in the fs module that return promises. The fs functions that return promises let us run asynchronous operations sequentially much more easily.

Categories
Express JavaScript Nodejs

Guide to the Express Router Object — Multiple Requests and Middleware

The Express router object is a collection of middlewares and routes. It's a mini-app within the main app.

It can only perform middleware and routing functions and can't stand on its own. It also behaves like middleware itself, so we can use it with app.use or as an argument to another router's use method.

In this article, we’ll look at the router object’s methods, including route and use.

Methods

router.route(path)

We can use router.route to handle multiple kinds of requests with one chain of function calls.

For example, we can use it as follows:

const express = require('express');  
const bodyParser = require('body-parser');
const app = express();  
const router = express.Router();
app.use(bodyParser.json());  
app.use(bodyParser.urlencoded({ extended: true }));
router.route('/')  
  .all((req, res, next) => {  
    next()  
  })  
  .get((req, res, next) => {  
    res.send('foo');  
  })  
  .put((req, res, next) => {  
    next();  
  })  
  .post((req, res, next) => {  
    next();  
  })  
  .delete((req, res, next) => {  
    next();  
  })
app.use('/foo', router);  
app.listen(3000);

Then when we make a GET request to /foo, we get foo.

We can also write something like:

const express = require('express');  
const bodyParser = require('body-parser');
const app = express();  
const router = express.Router();
app.use(bodyParser.json());  
app.use(bodyParser.urlencoded({ extended: true }));
router.route('/')  
  .all((req, res, next) => {  
    console.log('all requests');  
    next();  
  })  
  .get((req, res, next) => {  
    console.log('get request');  
    res.send('foo');  
    next();  
  })  
  .put((req, res, next) => {  
    console.log('put request');  
    res.send('put');  
    next();  
  })  
  .post((req, res, next) => {  
    console.log('post request');  
    res.send('post');  
    next();  
  })  
  .delete((req, res, next) => {  
    console.log('delete request');  
    res.send('delete');  
    next();  
  })
app.use('/foo', router);  
app.listen(3000);

Then when GET, POST, PUT, or DELETE requests are made to /foo, we get the corresponding response.

The middleware ordering is based on when the route is created and not when the handlers are added to the route.

router.use([path], [function, …] function)

The router.use method lets us add one or more middleware functions with an optional mount path, which defaults to /.

It’s similar to app.use, except it’s applied to the router object instead of the whole app.

For example, if we have:

const express = require('express');  
const bodyParser = require('body-parser');
const app = express();  
const router = express.Router();
app.use(bodyParser.json());  
app.use(bodyParser.urlencoded({ extended: true }));
router.use((req, res, next) => {  
  console.log('router middleware called');  
  next();  
})
router.get('/', (req, res) => {  
  res.send('foo');  
})
app.get('/', (req, res) => {  
  res.send('hi');  
})
app.use('/foo', router);  
app.listen(3000);

Then we get router middleware called when we make a GET request to the /foo path since we’re going through the router middleware to do that.

On the other hand, if we make a GET request to /, we don’t get that message logged since we aren’t using the router object to handle that request.

This means that attaching a middleware to the router object lets us do something before a route that's handled by the router object is run.

We can also pass in multiple middlewares as a comma-separated list, as an array of middlewares, or as a combination of both.

For example, we can write:

const express = require('express');  
const bodyParser = require('body-parser');
const app = express();  
const router = express.Router();
app.use(bodyParser.json());  
app.use(bodyParser.urlencoded({ extended: true }));
const mw1 = (req, res, next) => {  
  console.log('mw1 called');  
  next();  
}
const mw2 = (req, res, next) => {  
  console.log('mw2 called');  
  next();  
}
const mw3 = (req, res, next) => {  
  console.log('mw3 called');  
  next();  
}
router.use([mw1, mw2], mw3);
router.get('/', (req, res) => {  
  res.send('foo');  
})
app.use('/foo', router);  
app.listen(3000);

Then all 3 middlewares, mw1, mw2, and mw3, get run, so we get:

mw1 called  
mw2 called  
mw3 called

logged if we make a GET request to /foo .

We can also specify a path for the middleware. If it's specified, then only requests made to the specified path are handled by the middleware.

For example, if we have:

const express = require('express');  
const bodyParser = require('body-parser');
const app = express();  
const router = express.Router();
app.use(bodyParser.json());  
app.use(bodyParser.urlencoded({ extended: true }));
router.use('/bar', (req, res, next) => {  
  console.log('middleware called');  
  next();  
});
router.get('/bar', (req, res) => {  
  res.send('bar');  
})
router.get('/', (req, res) => {  
  res.send('foo');  
})
app.use('/foo', router);  
app.listen(3000);

Then when a GET request is made to /foo/bar, our middleware is called, so we get middleware called in this case.

If we make a GET request to /foo, our middleware doesn't get called.

We can also pass in a pattern or regex for the path that we pass into use .

For example, we can write:

const express = require('express');  
const bodyParser = require('body-parser');
const app = express();  
const router = express.Router();
app.use(bodyParser.json());  
app.use(bodyParser.urlencoded({ extended: true }));
router.use('/ab?c', (req, res, next) => {  
  console.log('middleware called');  
  next();  
});
router.get('/ab?c', (req, res) => {  
  res.send('bar');  
})
app.use('/foo', router);  
app.listen(3000);

Then when we make a GET request to /foo/abc, we get middleware called logged. We get the same result with /foo/ac.

We can pass in a regex as the path as follows:

const express = require('express');  
const bodyParser = require('body-parser');
const app = express();  
const router = express.Router();
app.use(bodyParser.json());  
app.use(bodyParser.urlencoded({ extended: true }));
router.use('/a(bc)?$', (req, res, next) => {  
  console.log('middleware called');  
  next();  
});
router.get('/a(bc)?$', (req, res) => {  
  res.send('bar');  
})
app.use('/foo', router);  
app.listen(3000);

Then when we make a GET request to /foo/abc or /foo/a, we get middleware called logged.

Conclusion

We can handle multiple kinds of requests in a router object with the route method. It lets us chain a series of method calls and pass in route handlers for each kind of request.

With the use method, we can pass in middleware that only applies to routes handled by the router object. Any other route won’t invoke the middleware.

We can pass in a string or regex path to the use method to only handle requests that are made to certain paths.