Categories
Modern JavaScript

Best of Modern JavaScript — Generator Best Practices

Since 2015, JavaScript has improved immensely.

It’s much more pleasant to use it now than ever.

In this article, we’ll look at JavaScript generators and new regex features.

The Asterisk

Generator functions are marked with an asterisk.

It must be between the function keyword and the function name.

And it can have any kind of spacing.

But usually, it has this format:

function* gen() {
  //..
}

Generator Function Declarations and Expressions

Generator function declarations and expressions are both valid.

We can write generator function declarations by writing:

function* gen() {
  //..
}

And we can create generator function expressions by assigning an anonymous generator function to a variable:

const gen = function* () {
  //..
}

Generator Method Definitions

We can add generator method definitions by writing:

const obj = {
  * gen(x, y) {
    //...
  }
};

We have the asterisk before the method name.

This is a shorthand for the following:

const obj = {
  gen: function*(x, y) {
    //...
  }
};

Generator method definitions are similar to getters and setters.

For example, we can write:

const obj = {
  get foo() {
    //...
  },
  set foo(value) {
    //...
  }
};

Like the asterisk, get and set are modifiers for the foo methods.

Recursive yield

We can add an asterisk to yield to call another generator function from a generator function.

For instance, we can write:

function* foo(x) {
  //...
  yield* foo(x - 1);
  //..
}

We recursively call foo with the yield* keyword.
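
For a concrete, runnable sketch of yield* delegation (the countdown generator below is a hypothetical example, not from the original):

function* countdown(x) {
  if (x < 0) return;
  yield x;
  // delegate to another generator; its yielded values pass straight through
  yield* countdown(x - 1);
}
console.log([...countdown(3)]); // logs [3, 2, 1, 0]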

Why Use the function* Keyword for Generator Functions?

The function* syntax was chosen for generator functions because a new generator keyword could clash with existing code that already uses generator as an identifier.

yield

yield is a reserved word in strict mode.

So it can’t be used as an identifier in strict mode code.

If the ES6 code is in sloppy mode, then the keyword becomes a contextual keyword that’s available only inside generators.
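
For instance, as a minimal sketch, using yield as a variable name would fail in strict mode, while it works as the yield operator inside a generator:

'use strict';
// var yield = 1; // this line would be a SyntaxError in strict mode: yield is reserved
function* gen() {
  yield 1; // fine: yield is a keyword inside the generator body
}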

New Regex Features

New regex features include the y flag (also known as the sticky flag), which lets us anchor the starting point of the search to the lastIndex property of a regex object.

This is similar to the ^ anchor, but the match for ^ always starts from 0.

Like the g flag, this lets us match a pattern multiple times with repeated exec calls.

The difference is that with y, each match has to start immediately after its predecessor.

For example, we can use the RegExp.prototype.exec method to find matches in a string.

If we have:

const REGEX = /foo/;

REGEX.lastIndex = 8;
const match = REGEX.exec('barfoobarfoo');
console.log(match.index); // 3
console.log(REGEX.lastIndex); // 8

Then the search ignores the lastIndex property of the regex, but if we add the g flag:

const REGEX = /foo/g;

REGEX.lastIndex = 8;
const match = REGEX.exec('barfoobarfoo');
console.log(match.index); // 9
console.log(REGEX.lastIndex); // 12

Then the pattern search starts at the lastIndex value.

If y is set:

const REGEX = /foo/y;

REGEX.lastIndex = 9;
const match = REGEX.exec('barfoobarfoo');
console.log(match.index); // 9
console.log(REGEX.lastIndex); // 12

If we set the lastIndex to 9, then we get the 'foo' match with that index.

lastIndex has to be the exact start point of the match for the y flag to pick up the match.

lastIndex will be updated to 12 after the match is found.

Setting the g and y flags together behaves the same as setting y alone.
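
Putting this together, a sticky regex plus exec in a loop can pull out consecutive matches, each starting right where the previous one ended. A minimal sketch (the TOKEN pattern is just an illustrative example):

const TOKEN = /\s*(\w+)/y;
const str = 'foo bar baz';
let match;
// each exec call must match starting exactly at TOKEN.lastIndex
while ((match = TOKEN.exec(str)) !== null) {
  console.log(match[1], TOKEN.lastIndex);
}
// foo 3
// bar 7
// baz 11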

Conclusion

Generator code style can vary.

Also, JavaScript regexes can now use the y flag to anchor matches at the lastIndex position.

Categories
Modern JavaScript

Best of Modern JavaScript — for-of Loop

Since 2015, JavaScript has improved immensely.

It’s much more pleasant to use it now than ever.

In this article, we’ll look at JavaScript iterable objects.

Iterable Data Sources

We can use the for-of loop to iterate through various kinds of iterable objects.

For example, we can loop through an array by writing:

const arr = ['foo', 'bar', 'baz'];
for (const x of arr) {
  console.log(x);
}

Then we get:

foo
bar
baz

logged.

We can also use it with strings.

For example, we can write:

for (const x of 'foo') {
  console.log(x);
}

Then we get the letters of 'foo' logged individually.

Maps are also iterable objects, so we can use them with the for-of loop.

For instance, we can write:

const map = new Map().set('foo', 1).set('bar', 2);
for (const [key, value] of map) {
  console.log(key, value);
}

We created a map with some keys and values with the Map constructor.

Then in the for-of loop, we looped through each entry and destructured the key and value from the map entries.

So we get:

foo 1
bar 2

from the console log.

Sets are also iterable objects that we can loop through.

For example, we can write:

const set = new Set().add('foo').add('bar');
for (const x of set) {
  console.log(x);
}

We created a set and added some items to it.

Then we used the for-of loop with it.

And so we get:

foo
bar

from the console log.

arguments is an object that’s available inside traditional functions to get the arguments from a function call.

We can use it with the for-of loop as it’s an iterable object:

function logArgs() {
  for (const x of arguments) {
    console.log(x);
  }
}
logArgs('foo', 'bar');

We passed in 'foo' and 'bar' to the logArgs function and we get the values from the for-of loop.

DOM NodeLists can also be iterated through with the for-of loop.

For instance, we can write:

for (const div of document.querySelectorAll('div')) {
  console.log(div);
}

to loop through all the divs on the page.

Iterable Computed Data

If a function returns an iterable object, we can loop through it with the for-of loop.

For instance, we can write:

const arr = ['a', 'b', 'c'];
for (const [index, element] of arr.entries()) {
  console.log(index, element);
}

Then we get:

0 "a"
1 "b"
2 "c"

We get the indexes and elements with the arr.entries method and destructure each entry in the loop head.

Plain Objects are not Iterable

Plain objects aren’t iterable, so we can’t use them with the for-of loop.

For instance, something like:

for (const x of {}) {
  console.log(x);
}

would give us the ‘Uncaught TypeError: {} is not iterable’ error.

Objects aren’t iterable by default because looping over an object’s properties to examine its structure and looping over its data are two different operations.

Keeping them separate avoids confusing plain objects with other kinds of data structures.

If we want to make an object iterable, we can add a Symbol.iterator generator method to it.
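
As a minimal sketch (assuming we want the object to yield its key-value pairs), we could write:

const obj = {
  foo: 1,
  bar: 2,
  *[Symbol.iterator]() {
    // yield one [key, value] pair per string key
    for (const key of Object.keys(this)) {
      yield [key, this[key]];
    }
  }
};
for (const [key, value] of obj) {
  console.log(key, value); // foo 1, then bar 2
}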

Conclusion

The for-of loop can iterate through many kinds of iterable objects.

Categories
Modern JavaScript

Best of Modern JavaScript — Closable Iterators

Since 2015, JavaScript has improved immensely.

It’s much more pleasant to use it now than ever.

In this article, we’ll look at JavaScript iterable objects.

Closable Iterators

An iterator is closable if it has the return method.

This is an optional feature.

Array iterators aren’t closable.

For instance, if we write:

const arr = ['foo', 'bar', 'baz'];
const iterator = arr[Symbol.iterator]();
console.log('return' in iterator); // false

We see that there’s no return method.

On the other hand, generator objects are closable by default.

For instance, if we have:

function* genFn() {
  yield 'foo';
  yield 'bar';
  yield 'baz';
}

const gen = genFn();

We can close the generator gen by calling the return method on it.

For instance, we can write:

function* genFn() {
  yield 'foo';
  yield 'bar';
  yield 'baz';
}

const gen = genFn();
console.log(gen.next()); // {value: 'foo', done: false}
gen.return();
console.log(gen.next()); // {value: undefined, done: true}
console.log(gen.next()); // {value: undefined, done: true}

After we call return, the generator is done.

So the last 2 next calls give us:

{value: undefined, done: true}

If an iterator isn’t closable, we can continue iterating over it after an exit from a for-of loop.

So if we have an iterable with a non-closable iterator like an array, we can continue looping through it:

const arr = ['foo', 'bar', 'baz'];

for (const x of arr) {
  console.log(x);
  break;
}
for (const x of arr) {
  console.log(x);
}

After we used break to break out of the first loop, the 2nd loop still iterates from start to end.

Prevent Iterators from Being Closed

We can prevent iterators from being closed by changing the return method.

If the return method returns an object with done set to false, the iterator can’t be closed.

For example, we can write:

class NoReturn {
  constructor(iterator) {
    this.iterator = iterator;
  }
  [Symbol.iterator]() {
    return this;
  }
  next() {
    return this.iterator.next();
  }
  return(value) {
    return {
      done: false,
      value
    };
  }
}

function* genFn() {
  yield 'foo';
  yield 'bar';
  yield 'baz';
}
const gen = genFn();
const noReturn = new NoReturn(gen);

for (const x of noReturn) {
  console.log(x);
  break;
}
for (const x of noReturn) {
  console.log(x);
}

Since we passed our generator, which is closable, to the NoReturn constructor, the loop will continue iterating through the items.

So we get:

foo
bar
baz

logged.

The second loop just continues the iteration from where the first one left off, since the iterator hasn’t been closed.

We can also make generators unclosable by setting the prototype.return method to undefined .

However, we should be aware that this doesn’t work with all transpilers.

For instance, we can write:

function* genFn() {
  yield 'foo';
  yield 'bar';
  yield 'baz';
}

genFn.prototype.return = undefined;

We set the genFn.prototype.return property to be undefined .

Then we can confirm that it’s unclosable by writing:

const gen = genFn();

for (const x of gen) {
  console.log(x);
  break;
}
for (const x of gen) {
  console.log(x);
}

We get:

foo
bar
baz

from the 2 loops.

Using break didn’t close the generator.

Conclusion

We can adjust whether we can close an iterator.

If closing is disabled, this will let us loop through the iterator items after breaking and similar operations.

Categories
Modern JavaScript

Best of Modern JavaScript — Async and Promises

Since 2015, JavaScript has improved immensely.

It’s much more pleasant to use it now than ever.

In this article, we’ll look at JavaScript async programming.

Sync and Async

Synchronous code always runs before async callbacks.

So if we have:

setTimeout(function() {
  console.log(2);
}, 0);
console.log(1);

Then we get:

1
2

The console log outside is synchronous, so it runs first.

The callback’s console log runs after the synchronous code, since it’s async.

The event loop may be blocked if we run something synchronous and long-running, since the whole app runs on a single thread.

Both the user interface and other computations are all in one thread.

So if one thing runs, then whatever comes after it can’t run until that thing is done.

For instance, if we have a synchronous sleep function, then that’ll pause the execution of the whole program.

If we have:

function sleep(ms) {
  const start = Date.now();
  while ((Date.now() - start) < ms);
}

console.log('start');
sleep(2000);
console.log('end');

Then we see 'start' logged first, wait 2 seconds, and then see 'end' logged.

The sleep function uses a while loop, so it’s synchronous.

This means it’ll hold up the whole program from running.

Avoiding Blocking

To avoid blocking the UI thread, we can use different kinds of async code, like web workers, setTimeout, or promises.

One example of async code is the Fetch API.

We can get data by making HTTP requests without blocking the whole thread.

For instance, we can write:

fetch('https://api.agify.io/?name=michael')
  .then(res => res.json())
  .then(res => {
    console.log(res);
  })

to use the fetch function to get the data.

It returns a promise so it’s async.

The then callbacks are called only when the results are ready.

Until then, the promise stays pending.

Asynchronous Results

Other examples include Node.js async callbacks.

For example, the readFile method takes a callback that’s run asynchronously when the file is read.

We can write:

const fs = require('fs');
fs.readFile('foo.txt', {
    encoding: 'utf8'
  },
  function(error, text) {
    if (error) {
      // ...
    }
    console.log(text);
  });

Then we read foo.txt with readFile.

Callbacks

Callbacks are problematic because error handling is complicated.

We’ve to handle errors at each callback.

The signatures are also less elegant because there’s no separation of concerns between inputs and outputs.

Callback-based async functions can’t return their results directly; all results flow through the callback.

Composition is also more complicated.

Node-style callbacks have some problems.

We’ve to check for errors with if statements.

Reusing error handlers is harder.

And providing a default handler is also harder.
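
To make these pain points concrete, here’s a sketch of nested Node-style callbacks (the file names are hypothetical); note how each level repeats its own error check:

const fs = require('fs');
fs.readFile('a.txt', 'utf8', function(error, a) {
  if (error) return console.error(error);
  fs.readFile('b.txt', 'utf8', function(error, b) {
    if (error) return console.error(error); // the error check is repeated at every level
    console.log(a + b);
  });
});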

Promises

Promises are a pattern of async programming where a single result is returned asynchronously.

They’re better than callbacks since they can be chained.

Promises are returned by various functions and serve as the placeholder for the final result.

For instance, we can consume a promise by writing:

asyncFunction(arg)
  .then(result => {
    console.log(result);
  });

We can have more than one then call if the then callback returns a promise.

For instance, we can write:

asyncFunction(arg)
  .then(result1 => {
    console.log(result1);
    return asyncFunction2(x, y);
  })
  .then(result2 => {
    console.log(result2);
  });
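
In contrast to callbacks, a single catch at the end of the chain can handle errors from any step. A sketch, assuming asyncFunction and asyncFunction2 are promise-returning functions:

asyncFunction(arg)
  .then(result1 => asyncFunction2(result1))
  .then(result2 => {
    console.log(result2);
  })
  .catch(error => {
    // one handler covers errors thrown at any step of the chain
    console.error(error);
  });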

Conclusion

Promises are a better way to write async code in JavaScript.

Categories
Modern JavaScript

Best of Modern JavaScript — Typed Arrays

Since 2015, JavaScript has improved immensely.

It’s much more pleasant to use it now than ever.

In this article, we’ll look at JavaScript typed arrays.

Clamped Conversion

Clamped conversion works differently than modulo conversion.

JavaScript provides us with typed array constructors that do clamped conversion, such as the Uint8ClampedArray constructor.

It works differently from modulo conversion in that all underflowing values are converted to the lowest value.

And all overflowing values are converted to the highest value.

For example, we can create one by writing:

const uint8c = new Uint8ClampedArray(1);

Then if we write:

uint8c[0] = 255;

We get 255 as the value of uint8c[0] .

If we write:

uint8c[0] = 256;

We still get 255.

On the other hand, if we write:

uint8c[0] = 0;

We get 0.

And if we write:

uint8c[0] = -1;

We still get 0.
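
For comparison, a plain Uint8Array uses modulo conversion, so out-of-range values wrap around instead of being clamped:

const uint8 = new Uint8Array(1);
uint8[0] = 256;
console.log(uint8[0]); // 0
uint8[0] = -1;
console.log(uint8[0]); // 255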

Endianness

The endianness matters if we store multiple bytes in our typed arrays.

Big-endian means the most significant byte comes first.

So if we have 2 bytes like 0xABCD , then 0xAB comes first and then 0xCD .

Little-endian means the least significant byte comes first.

This means the bytes are stored in the opposite order from a big-endian layout.

Endianness differs between CPU architectures but is consistent across a platform’s native APIs.

Typed arrays let us communicate with those APIs.

This is why their endianness can’t be changed.

The endianness of binary files and protocols is fixed across platforms.

DataView lets us specify the endianness, so we can read and write files and protocol data with a fixed byte order across platforms.
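
As a minimal sketch of choosing the byte order explicitly with a DataView:

const buffer = new ArrayBuffer(2);
const view = new DataView(buffer);
view.setUint16(0, 0xABCD, true); // true means little-endian: 0xCD is stored first
console.log(view.getUint8(0).toString(16)); // 'cd'
console.log(view.getUint8(1).toString(16)); // 'ab'
console.log(view.getUint16(0, false).toString(16)); // 'cdab' when read back as big-endian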

Negative Indexes

Negative indexes can be used with the slice method.

Index -1 means the last element of the typed array.

For instance, we can write:

const arr = Uint8Array.of(0, 1, 2);
const last = arr.slice(-1);

We created a Uint8Array with some numbers.

Then we called slice with -1 to return a typed array with the last entry of arr .

Offsets must be non-negative integers.

So if we pass in a negative number to the DataView.prototype.getInt8 method, we’ll get a RangeError.
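
For instance, a minimal sketch of that error:

const view = new DataView(new ArrayBuffer(4));
try {
  view.getInt8(-1); // negative offsets aren't allowed
} catch (error) {
  console.log(error instanceof RangeError); // true
}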

ArrayBuffers

ArrayBuffers store the data and views let us read and change them.

To create a DataView, we have to provide the constructor with an ArrayBuffer.

Typed array constructors can optionally create ArrayBuffers for us.

The ArrayBuffer constructor takes a number with the length of the ArrayBuffer in bytes.

The ArrayBuffer constructor has the isView static method that returns true if the argument we pass in is a view for an ArrayBuffer.

Only typed arrays and DataViews have the [[ViewedArrayBuffer]] internal slot which makes them views.

The ArrayBuffer.prototype.byteLength is an instance method that returns the capacity of the ArrayBuffer in bytes.

It’s a getter method.

ArrayBuffer.prototype.slice(start, end) is an instance method that returns a new ArrayBuffer containing the bytes at indexes greater than or equal to start and less than end .

start and end can be negative.
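
A minimal sketch that exercises these ArrayBuffer features:

const buffer = new ArrayBuffer(8);
console.log(buffer.byteLength); // 8
const view = new DataView(buffer);
console.log(ArrayBuffer.isView(view)); // true
console.log(ArrayBuffer.isView(buffer)); // false, the buffer itself isn't a view
const sliced = buffer.slice(2, -2); // copies bytes 2 up to (but not including) index 6
console.log(sliced.byteLength); // 4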

Conclusion

Typed arrays can wrap values in various ways.

Also, we’ve to have the correct endianness to store and communicate the data properly.

ArrayBuffers let us store binary data and slice them.