Like any other kind of app, JavaScript apps have to be written well.
Otherwise, we run into all kinds of issues later on.
In this article, we’ll look at some best practices we should follow when writing Node apps.
Adding a Reverse Proxy with Nginx
We should never expose our Express app directly to the Internet.
Instead, we should use a reverse proxy to direct traffic to our app.
This way, we can add caching, control our traffic, and more.
We can install Nginx by running:
apt update
apt install nginx
Then we can start it by running:
systemctl start nginx
Once we have Nginx running, we can edit /etc/nginx/nginx.conf
to point Nginx to our app.
For example, we can write:
server {
  listen 80;

  location / {
    proxy_pass http://localhost:3000;
  }
}
We use the proxy_pass
directive to pass any traffic arriving on port 80 to our Express app, which is listening on port 3000.
Then we restart Nginx to make our config take effect by running:
systemctl restart nginx
Load Balancing with Nginx
To add load balancing with Nginx, we can edit nginx.conf
by writing:
http {
  upstream fooapp {
    server localhost:3000;
    server domain2;
    server domain3;
    ...
  }
  ...
}
to add the instances of our app.
The upstream
section creates a server group that will load balance traffic across the servers we specify.
Then in the server
section, we add:
server {
  listen 80;

  location / {
    proxy_pass http://fooapp;
  }
}
to use the fooapp
server group.
Then we restart Nginx with:
systemctl restart nginx
Enabling Caching with Nginx
To enable caching, we can use the proxy_cache_path
directive.
In nginx.conf
, we write:
http {
  proxy_cache_path /data/nginx/cache levels=1:2 keys_zone=STATIC:10m
                   inactive=24h max_size=1g;
  ...
}
Cached entries that aren’t accessed for 24 hours are evicted (inactive=24h), the cache can grow to at most 1GB (max_size=1g), and keys_zone=STATIC:10m names the cache zone STATIC and gives it 10MB of shared memory for keys.
Also, we add:
server {
  listen 80;

  location / {
    proxy_pass http://fooapp;
    proxy_set_header Host $host;
    proxy_buffering on;
    proxy_cache STATIC;
    proxy_cache_valid 200 1d;
    proxy_cache_use_stale error timeout invalid_header updating
                          http_500 http_502 http_503 http_504;
  }
}
proxy_buffering
is set to on
to buffer responses from our app before sending them to clients.
proxy_cache
is set to STATIC
to use the cache zone we defined with proxy_cache_path.
proxy_cache_valid
sets the cached response to expire in 1 day if the status code is 200.
proxy_cache_use_stale
determines when a stale cached response can be used during communication.
We set it so a stale response is served when there’s an error, a timeout, or an invalid header from our app, and while the cached entry is being refreshed (the updating
keyword).
We can also serve the stale copy if the response status code is 500, 502, 503, or 504.
Enabling Gzip Compression with Nginx
We can enable Gzip compression with Nginx by using the gzip
module.
For instance, we can write:
server {
  gzip on;
  gzip_types text/plain application/xml;
  gzip_proxied no-cache no-store private expired auth;
  gzip_min_length 1000;
  ...
}
in nginx.conf
to enable Gzip with gzip on
.
gzip_types
lets us set the MIME types of responses that will be gzipped.
gzip_proxied
lets us enable or disable gzipping of responses to proxied requests.
We enable it for responses whose Cache-Control
header contains the private
, expired
, no-cache
, or no-store
values.
auth
means we enable compression if the request includes an Authorization header field.
gzip_min_length
sets the minimum length of a response that will be gzipped, as determined by the Content-Length
response header field, in bytes.
Conclusion
There are many things we can do with Nginx to improve the performance of our apps.