Pseudo-Random Number Generation
Consider a function:
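For example, a simple pure function (the specific function here is a hypothetical stand-in):

```python
# A deterministic function: the same input always produces the same output.
def square(x):
    return x * x

print(square(4))  # 16
print(square(4))  # 16 again -- no surprises, ever
```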
No matter how many times we call this function, it will always give the same output for a given input.
On the other hand, consider some random (stochastic) process: the seventh roll of a die, the temperature at 3 PM, or the quantum state of a particle. None of these is guaranteed to give the same output every time.
The outcome of this random function is known as a random variable. It’s hard to think of a field that doesn’t rely in some way on random variables. It’s no wonder that they play a central role in modeling almost any real-world system.
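Simulating one such random variable, a die roll, with Python's standard library (an illustrative example) yields a value we cannot predict in advance:

```python
import random

# Each call samples the random variable "outcome of a die roll";
# repeated calls are not guaranteed to match.
roll = random.randint(1, 6)
print(roll)  # some value in 1..6, varying from run to run
```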
One of the advantages computers have over manual work is their deterministic behavior: if we run an algorithm under the same conditions, it will always give the same result.
This advantage can also be a drawback: a deterministic machine cannot, by itself, produce truly random numbers. Computer scientists have been aware of this trait for a long time and have devised a family of algorithms known as Pseudo-Random Number Generators (PRNGs), which deterministically produce sequences of numbers that appear random.
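A PRNG turns determinism into an asset: fixing the seed reproduces the same "random" sequence exactly. A minimal sketch using Python's standard library:

```python
import random

random.seed(42)
first = [random.random() for _ in range(3)]

random.seed(42)          # re-seed with the same value
second = [random.random() for _ in range(3)]

# Same seed, same sequence -- pseudo-randomness is reproducible.
print(first == second)   # True
```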
John von Neumann, who proposed one of the earliest PRNG algorithms, famously cautioned:
“Anyone who considers arithmetical methods of producing random digits is, of course, in a state of sin.” (1951)
One of the most commonly used PRNGs, the Mersenne Twister (named after the Mersenne prime 2^19937 − 1, which is its period), is widely used in languages and libraries such as C++, R, CUDA, NumPy, and more.
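CPython's own `random` module is a Mersenne Twister (MT19937), and its sizable internal state, 624 32-bit words plus a position index, is visible through `random.getstate()`:

```python
import random

# getstate() returns (version, internal_state, gauss_next);
# for MT19937, internal_state holds 624 state words plus an index.
_, internal_state, _ = random.getstate()
print(len(internal_state))  # 625
```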
Despite its broad adoption, the Mersenne Twister has notable issues: a large internal state (about 2.5 KB), failures on some statistical tests (e.g., the linear-complexity tests in TestU01's BigCrush battery), and no natural way to produce many independent streams for parallel computation.
Counter-based PRNGs, which we’ll review shortly, address these shortcomings of the Mersenne Twister and other conventional PRNGs. In 2011, researchers at D. E. Shaw Research introduced Threefry, a counter-based PRNG that serves as the basis for the JAX implementation.
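The counter-based idea can be sketched with a toy generator (the mixing function below is our own illustrative choice, not Threefry): output *i* of stream *key* is a pure function of the pair (key, i), so no state is carried between calls, and outputs can be computed in any order or in parallel.

```python
def toy_counter_prng(key: int, counter: int) -> float:
    """Toy counter-based PRNG: a stateless mix of (key, counter).

    NOT Threefry -- just an illustration of the structure. Real
    counter-based PRNGs use much stronger, crypto-inspired mixing.
    """
    x = (key * 0x9E3779B97F4A7C15 + counter) & 0xFFFFFFFFFFFFFFFF
    x ^= x >> 30
    x = (x * 0xBF58476D1CE4E5B9) & 0xFFFFFFFFFFFFFFFF
    x ^= x >> 27
    x = (x * 0x94D049BB133111EB) & 0xFFFFFFFFFFFFFFFF
    x ^= x >> 31
    return x / 2**64  # map the 64-bit word to [0, 1)

# Same (key, counter) always yields the same number...
assert toy_counter_prng(0, 7) == toy_counter_prng(0, 7)
# ...and a whole stream can be generated without any mutable state.
stream = [toy_counter_prng(0, i) for i in range(4)]
```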
We’ll review the features of the JAX implementation (referred to from here on as the JAX PRNG).