Introduction to Shrinking

Get introduced to shrinking and the basic concepts of how to manipulate the process in PropEr.

Shrinking

A critical component of property-based testing is shrinking: the mechanism by which a property-based testing framework simplifies failure cases enough to let us figure out exactly what the minimal reproducible case is. While finding complex, obtuse failure cases is worthwhile, being able to reduce failing inputs to a simple counterexample truly is the killer feature.

But there are some cases where PropEr cannot do what we need: either it can't shrink large data structures well enough for the result to be understandable, or it doesn't shrink them the way we want it to. In this chapter, we'll learn about two macros that help us handle these cases:

  1. shrink
  2. let_shrink

These macros let us give the framework hints about what to do. But first, let's look at how shrinking works at a high level.

How shrinking works

In general, we can think of shrinking as the framework attempting to bring the data generator closer to its own zero-point, accepting each simplification only as long as the property keeps failing. Zero for a generator is somewhat arbitrary, but if we play with the default generators a bit by calling PropCheck.sample_shrink on them in the shell (a sample session follows this list), we'll notice the following:

  • A number tends to shrink from floating-point values toward integers, and integers tend to shrink toward the number 0.
  • Binaries tend to shrink from things full of bytes toward the empty binary.
  • Lists tend to shrink toward the empty list.
  • elements([A,B,C]) will shrink toward the value A.
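
For instance, here is roughly what such a shell session might look like in iex. The exact values and shrinking steps vary from run to run, so the output below is illustrative only; it assumes PropCheck is installed and that use PropCheck has imported sample_shrink and the basic generators.

    # Illustrative iex session; actual values and shrink paths differ per run.
    iex> use PropCheck
    iex> sample_shrink(number())
    4.37
    4
    0
    :ok
    iex> sample_shrink(list(integer()))
    [3, -8, 14]
    [3, -8]
    []
    :ok
    iex> sample_shrink(elements([:a, :b, :c]))
    :c
    :a
    :ok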

In short, data structures that contain other data tend to empty themselves, and other values try to find a neutral point. The nice aspect of this is that since custom generators are built from other generators, the shrinking is inherited, and a custom generator gets its shrinking for free. For example, in a map full of people's records made of strings and numbers, the strings will get shorter and simpler, the numbers will get closer to zero, and the map will retain fewer and fewer entries until only the components essential to trigger the failure are left.
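
To make that concrete, here is a minimal sketch of such a composite generator. The module and field names are made up for illustration, and it assumes PropCheck exposes the utf8, pos_integer, and map(key, value) generators mirroring PropEr's basic types.

    # Hypothetical generator for a map of people; no shrinking code is written,
    # yet every part of it shrinks because the underlying generators do.
    defmodule PeopleGen do
      use PropCheck

      # One person: the name shrinks toward shorter strings, the age toward 1.
      def person() do
        let {name, age} <- {utf8(), pos_integer()} do
          %{name: name, age: age}
        end
      end

      # A map of id => person: shrinks by dropping entries and simplifying the rest.
      def people() do
        map(pos_integer(), person())
      end
    end

Calling PropCheck.sample_shrink(PeopleGen.people()) in the shell would then show the whole structure getting simpler at each step, without any shrinking logic written by hand.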

But for some data types, there is no good zero-point. A vector of length 15 will always have length 15, and the same goes for a tuple. Similarly, larger user-defined recursive data structures (such as probabilistic ones) may not have an obvious way to shrink, or may need to shrink toward values other than the generator's default. For example, picture a chessboard: its neutral point is not when the board is empty, but when it's full, with all the pieces in their initial positions.
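
For instance, sampling the shrink of a fixed-size vector (output again illustrative) shows the elements getting simpler while the length stays put:

    # Illustrative: the vector keeps its three slots; only the contents shrink.
    iex> sample_shrink(vector(3, integer()))
    [7, -2, 19]
    [0, -2, 19]
    [0, 0, 0]
    :ok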

These cases, even if they are rare, require the shrink macro.
