Harry R. Schwartz


Generating Unique Tokens with Lazy Enumerators

Published 30 Sep 2015. Tags: ruby.

My day job involves a bunch of web development. I frequently work on APIs, so sometimes I need to generate unique authentication tokens. It’s easy to generate a token, but we probably don’t want to accidentally assign the same token to two different users. How can we ensure that we’re generating unique tokens?

Suppose we have some code to generate a token:

require 'securerandom'

def new_token(length)
  # Note: SecureRandom.hex(n) returns a string of 2n hex characters.
  SecureRandom.hex(length)
end


Further suppose that we’ve got a unique_token?(token) predicate that returns true when a token isn’t already in use.
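In a real application that predicate would probably be a database uniqueness check, but for illustration here’s a minimal sketch backed by an in-memory set (the ISSUED_TOKENS name is an invention of this sketch, not part of the post):

```ruby
require 'set'

# Hypothetical record of tokens we've already handed out. In production
# this would likely be a uniqueness query against a database instead.
ISSUED_TOKENS = Set.new

def unique_token?(token)
  # A token is unique if we haven't issued it before.
  !ISSUED_TOKENS.include?(token)
end
```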

In that case there’s an obvious stateful implementation that we could use:

def unique_token(length: DEFAULT_TOKEN_LENGTH)
  token = new_token(length)
  until unique_token?(token)
    token = new_token(length)
  end
  token
end


This totally works, and I wouldn’t be especially upset to find it in a codebase. It feels a little clunky, though: we have to call new_token in two places, and we’re frequently updating a local variable. Maybe there’s a more functional way to solve this problem?
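To see the loop end to end, here’s a self-contained version with stubbed-in helpers. The DEFAULT_TOKEN_LENGTH value and the ISSUED_TOKENS set are assumptions made for this demo; the post doesn’t specify either:

```ruby
require 'securerandom'
require 'set'

DEFAULT_TOKEN_LENGTH = 16  # assumed value for the demo

# Hypothetical record of tokens already in use.
ISSUED_TOKENS = Set.new

def new_token(length)
  SecureRandom.hex(length)
end

def unique_token?(token)
  !ISSUED_TOKENS.include?(token)
end

def unique_token(length: DEFAULT_TOKEN_LENGTH)
  token = new_token(length)
  until unique_token?(token)
    token = new_token(length)
  end
  token
end

first = unique_token
ISSUED_TOKENS << first  # record it as taken
second = unique_token   # the loop rejects anything in ISSUED_TOKENS
first == second         # => false, guaranteed by the uniqueness check
```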

Let’s think about what we’re doing here: we want to keep generating tokens until we find one that’s unique. Conceptually, we’re searching through an infinite list of tokens for one that we haven’t seen before. We can generate just such a list in Ruby by creating an Enumerator:

def lazy_token_enumerator(length)
  Enumerator.new do |yielder|
    loop do
      yielder.yield(new_token(length))
    end
  end
end


Enumerator’s constructor takes a block that defines the resulting object’s #each method. So every time we call #each on the enumerator we’ll generate a new token. We’ve effectively created an endless, lazily generated stream of tokens.
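We can see the laziness in action by pulling just a few elements from the stream; nothing is generated until something iterates over it (here #first drives #each under the hood):

```ruby
require 'securerandom'

def new_token(length)
  SecureRandom.hex(length)
end

def lazy_token_enumerator(length)
  Enumerator.new do |yielder|
    loop do
      yielder.yield(new_token(length))
    end
  end
end

tokens = lazy_token_enumerator(8)
tokens.first(3)  # draws exactly three tokens from the infinite stream
```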

Enumerator includes Enumerable, so we get all the fancy higher-order methods that come with that module. One of those is #detect, which we can use to find the first unique token:

def unique_token(length: DEFAULT_TOKEN_LENGTH)
  lazy_token_enumerator(length).
    detect { |token| unique_token?(token) }
end


So there we go! Unique tokens generated without using local variables.

This concept of searching an infinite sequence generalizes to any situation where we need to generate objects until we find one with a given property. Lazy enumerators are awfully useful tools.
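As one example of that generalization (my example, not from the post): the same enumerator-plus-#detect shape finds the first Fibonacci number past a threshold, with only the generator and predicate swapped out:

```ruby
# An infinite Fibonacci stream, searched with #detect -- same shape as
# the token search above, just a different generator and predicate.
fibs = Enumerator.new do |yielder|
  a, b = 0, 1
  loop do
    yielder.yield(a)
    a, b = b, a + b
  end
end

fibs.detect { |n| n > 1_000 }  # => 1597
```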