But what are they really capable of, other than just that?
I've never fully gotten my head around this ES6 feature since its inception, so I couldn't find many other common usages besides its apparent usefulness for handling continuous streams of data that arrive asynchronously.
So I decided to explore them a bit and understand their true purpose.
My Google searches led me to this Wikipedia article, which says:
In computer science, a generator is a special routine that can be used to control the iteration behaviour of a loop. In fact, all generators are iterators. A generator is very similar to a function that returns an array, in that a generator has parameters, can be called and generates a sequence of values.
However, instead of building an array containing all the values and returning them all at once, a generator yields the values one at a time, which requires less memory and allows the caller to get started processing the first few values immediately. In short, a generator looks like a function but behaves like an iterator.
Reading the above paragraph sounded a little intimidating at first.
But as I scrolled down, I found some sample code that looked familiar to me…
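The sample code didn't survive in this copy of the post, so here's a sketch of the kind of Fibonacci generator the Wikipedia article shows (variable names are my own):

```javascript
// A generator that yields the Fibonacci sequence one value at a time.
function* fibonacci() {
  let [prev, curr] = [0, 1];
  while (true) {
    yield curr;
    [prev, curr] = [curr, prev + curr];
  }
}

// Pull values on demand instead of building the whole array up front.
const fib = fibonacci();
const firstFive = [];
for (let i = 0; i < 5; i++) {
  firstFive.push(fib.next().value);
}
console.log(firstFive); // [1, 1, 2, 3, 5]
```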
Things started to make more sense to me.
Looking at the Fibonacci series operation, it reminded me of an age-old technical interview question where I was asked to produce a similar operation some years ago (see below).
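The interview snippet is also missing from this copy; a sketch of the classic eager version, which builds the whole series in an array before returning it:

```javascript
// The classic interview answer: build the whole series up front in an array.
function fibonacciSeries(n) {
  const series = [1, 1];
  for (let i = 2; i < n; i++) {
    series.push(series[i - 1] + series[i - 2]);
  }
  return series.slice(0, n);
}

console.log(fibonacciSeries(5)); // [1, 1, 2, 3, 5]
```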
Both of the Fibonacci algorithms above reach the same end result, but with one key difference.
That difference is about having greater control over your data collection by manipulating its internal state, with the goal of making it more robust and performant.
Hmm… What do I mean by that exactly?
Well, if you recall working with for loops like these…
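(The original example isn't preserved here; this is a sketch of the familiar pattern being referred to.)

```javascript
// The everyday index-based for loop we all know by heart.
const items = ['a', 'b', 'c'];
const seen = [];
for (let i = 0; i < items.length; i++) {
  seen.push(items[i]);
}
console.log(seen.join(', ')); // a, b, c
```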
You know these by heart already.
You've seen, and obviously written, these in various places: fetching customer sales from a database and rendering them on the front end using arrays, or pre-formatting the data attributes of an array of images so that only the relevant information is displayed to users as a carousel, for example.
We think we know enough about what's required to do a good job.
But what if we were given the extra capability to do an even better job?
By a better job, I mean solving the problems that come with more complex scenarios.
Going back to the for loop example above, what if arr is no longer a simple Array?
arr could be:
- a String whose characters you want to iterate over one by one
- an infinitely large collection that requires intensive processing time
- a data structure whose elements don't all share the same attributes, e.g. a mixture of primitive and object types
To illustrate the first point, let's say you have a string, and the requirement is to print out each letter of the word in sequential order, line by line, on the console.
Your first instinct is to convert the string into an array, iterate over it and print its contents.
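The snippet isn't preserved in this copy; a sketch, assuming the word is 'Hello':

```javascript
// An assumed example word (the original isn't shown).
const word = 'Hello';

// Convert the string into an array, then iterate and print each letter.
const letters = word.split('');
for (let i = 0; i < letters.length; i++) {
  console.log(letters[i]);
}
```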
As expected, it outputs the letters line by line on the screen.
But what if you could write this even better?
You can, using the ES6 for..of loop.
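Again the original code is missing; a sketch of the for..of version, using the same assumed word:

```javascript
const word = 'Hello';

// No intermediate array conversion, no index to track:
// for..of walks the string's characters directly.
for (const letter of word) {
  console.log(letter);
}
```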
The for..of loop does exactly the same thing as the previous code. The difference is that there's no array declaration to store, and no index to keep track of, when iterating.
With fewer lines of code, you achieve more with less. Especially if you want to write better iterative algorithms.
Though, you may wonder: what truly works under the hood of the for..of loop? What causes this magic to happen?
What I found out and learnt is that the for..of construct uses the Symbol.iterator method, which is responsible for the iteration behind the scenes. When it gets called, it returns an iterator whose next() method hands back the next value in the loop, along with a done boolean that tells us whether we're at the end of the iteration or not. You can find the explanation in detail here by this author.
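To make that concrete, here's what for..of does for us, done by hand (a minimal sketch using the standard iteration protocol):

```javascript
// Ask the string for its iterator, then drive it manually.
const word = 'Hi';
const iterator = word[Symbol.iterator]();

console.log(iterator.next()); // { value: 'H', done: false }
console.log(iterator.next()); // { value: 'i', done: false }
console.log(iterator.next()); // { value: undefined, done: true }
```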
After stumbling upon this, I came to the realisation that generators and iterators grant us flexibility in how we manage our iterations, which matters as our iteration requirements get trickier to implement. They provide us with protocols for setting our looping conditions, establishing our breakpoints and determining the outcome of the iteration.
From this, we have the following concepts:
- Iterables - a data structure that allows its data to be consumed.
- Iterators - a process or function that fetches data from some data structure.
- Generators - a special kind of function that acts as an iterator factory, letting you build custom iterables and iterators whose algorithm maintains its own state while iterating.
Going back to the for..of loop example: its iterables are usually arrays, sets or similar data structures in which the values reside, whilst its iterator is the property responsible for traversing that same data structure.
So now that we can write loops this way instead of the traditional for loop, what's the major benefit of doing so in the first place (other than simply cleaner code)?
The main selling point is reducing the high computational cost of processing very large datasets at runtime. Without iterators, you're forced to load all the data up front and process the entire set from start to finish.
With iterators, you get to work on the data in chunks. Consuming one small chunk after another, you can manipulate portions of the data collection efficiently, without the additional computational burden.
A good example: say you have to fetch customer sales (using the same example as before) from a database and load them on the screen. You don't want to load everything at once, since you could have hundreds, if not thousands, of records coming back from the SQL query. So you process them in chunks, incrementally, like so.
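The original snippet is missing; here's a sketch of the idea, with salesRecords standing in for the rows a query would return (the data and names are hypothetical):

```javascript
// Hypothetical stand-in for rows returned by the SQL query.
const salesRecords = Array.from({ length: 10 }, (_, i) => ({ id: i + 1 }));

// Yield the records a chunk at a time instead of processing them all at once.
function* inChunks(records, chunkSize) {
  for (let i = 0; i < records.length; i += chunkSize) {
    yield records.slice(i, i + chunkSize);
  }
}

const chunks = inChunks(salesRecords, 4);
console.log(chunks.next().value.length); // 4 - first chunk, render it now
console.log(chunks.next().value.length); // 4 - next chunk when we're ready
console.log(chunks.next().value.length); // 2 - the remainder
```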
We can also make our own iterators and iterables using the Symbol.iterator factory pattern.
The most common pattern you could write looks like the following.
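Since the original code isn't preserved, here's a sketch of that common pattern: a custom range object made iterable by hand (the shape is my own, but the protocol is standard).

```javascript
// A custom iterable: an object that knows how to produce its own iterator.
const range = {
  from: 1,
  to: 5,

  // The factory: for..of and the spread operator call this for a fresh iterator.
  [Symbol.iterator]() {
    let current = this.from;
    const last = this.to;
    return {
      // The iterator protocol: next() returns { value, done } pairs.
      next() {
        if (current <= last) {
          return { value: current++, done: false };
        }
        return { value: undefined, done: true };
      },
    };
  },
};

for (const n of range) {
  console.log(n); // 1, 2, 3, 4, 5 (one per line)
}
console.log([...range]); // [1, 2, 3, 4, 5]
```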
Once you get more comfortable writing (and handling) custom iterators and iterables of your own, at some point you may want to go and write your own generators, which can take control of the internal state of the iteration.
Generators are basically special functions for building iterator factories. They are declared with the function* syntax. Inside such a generator, you use the yield keyword to hand out the values you provide, and it takes care of the done/value pair checks for you.
Take the same Fibonacci series example we saw earlier…
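The code didn't survive here either; a sketch of the Fibonacci series rewritten as a generator:

```javascript
// The Fibonacci series as a generator: no array, no end condition needed.
function* fibonacci() {
  let [prev, curr] = [0, 1];
  while (true) {
    yield curr; // pause here until next() is called again
    [prev, curr] = [curr, prev + curr];
  }
}
```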
To make an iterator out of this, you will simply declare and instantiate it.
To start off the iteration sequence, you do the following:
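The instantiation snippet isn't shown in this copy; a sketch (the generator is repeated so the snippet runs on its own):

```javascript
// Repeated here so this snippet is self-contained.
function* fibonacci() {
  let [prev, curr] = [0, 1];
  while (true) {
    yield curr;
    [prev, curr] = [curr, prev + curr];
  }
}

// Declaring and instantiating: calling the function returns an iterator,
// but none of the body runs until the first next().
const fib = fibonacci();

console.log(fib.next()); // { value: 1, done: false }
console.log(fib.next()); // { value: 1, done: false }
console.log(fib.next()); // { value: 2, done: false }
```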
What's pretty amazing about this is that generators allow us to work on an infinitely large collection without worrying about locking up the single UI thread. Your app will continue to operate normally, and the rest of its functions/modules can run to completion without issues.
Heck! You can even declare as many iterators as you want; they won't pose any threat of locking up your browser's UI.
How is this possible??
It’s all because of this little guy here..
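The original three-line snippet is missing; a minimal sketch of yield suspending execution:

```javascript
function* steps() {
  console.log('chunk 1 processed');
  yield; // execution pauses right here until the next next() call
  console.log('chunk 2 processed');
}
```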
The yield keyword is the one responsible for all this 'magic'.
It gives us the ability to pause or suspend execution at some point during the iteration. What this means is that, given our infinite collection, I instruct my iterator to start processing elements in chunks by invoking next(). After processing the first chunk, I want it to pause; I don't want it to process the next chunk yet. I may decide to do other important tasks first, like interacting with other UI components. Whatever those other things are, the iterator will wait in suspense until I'm explicitly ready to invoke next() again. Then the following chunk gets processed and paused, and the cycle repeats.
yield remembers the position where you left off during iteration, so you can think of it as a flagpost: like planting a flagpost at every tree you pass as you wander through a thick forest.
Therefore, you can think of a number of useful applications for generators. These include:
- Infinite pagination for an infinite list of items
- Social media feeds that keep on loading non-stop
- Traversing deeply nested folders with better recursion techniques
- Parallel execution of database calls for the same result set.
- etc, etc
What we've seen is that using iterators and generators provides a number of advantages:
Robust Control over Looping Performance
No longer will you be at the mercy of a loop operation with no certainty about how it will end. Using generators/iterators, you take the driver's seat: you tell it whether to go fast, go slow, or pause altogether.
More optimised code.
Arguably, you gain cleaner code and get a good deal of performance optimisation for free.
Deferred executions or Lazy Evaluations
All thanks to the yield keyword.
Can handle infinitely large data collections of any data structure type
You just have to remember the iteration protocols you need to set.
Parallel programming and complex computational algorithms
Using your computer science algorithms knowledge, you can run as many common search algorithms as you want without hogging additional memory resources.
That’s it. Go out there and give these iterators/generators some love yeah ❤️❤️😘?
Till next time, Happy Coding!