Positional Encoding

Learn the benefits of using positional encoding in neural fields and how to implement it.

Overview

The positional encoding technique has become integral to the transformer architecture in recent years and has also proved critical to neural radiance fields. This technique provides a powerful and expressive way to embed position information into inputs and overcome the frequency bias in neural networks. We will delve into several areas where positional encodings are employed and then introduce a basic implementation used in neural field research.

Positional encoding in transformers

Positional encoding is integral to the performance of large-scale transformer models, which have achieved state-of-the-art results on a wide variety of machine learning tasks.

The landmark paper “Attention Is All You Need” introduced the transformer architecture, leveraging the powerful attention mechanism to model sequential data with greater expressive power and faster training times than the then state-of-the-art models based on recurrent or convolutional networks. The paper also introduced positional encoding as a means to solve a critical problem with transformer models: the lack of position information.

Recurrent and convolutional layers have an inherently sequential or local structure that lets neurons incorporate relational information between inputs; the self-attention mechanism, by contrast, is permutation-invariant and treats its inputs as an unordered set. Order matters at every scale in text: long-range information helps maintain consistency across long documents, while short-range information is essential for understanding sentence structure and grammatical context. The positional encoding technique injects absolute position data into the input embeddings so that transformer models can access a position signal. While positional embeddings can be learned, the authors found that fixed positional encodings are just as effective. They propose the following formulation for their positional encoding function PE:
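PE(pos, 2i) = sin(pos / 10000^(2i / d_model))
PE(pos, 2i+1) = cos(pos / 10000^(2i / d_model))

Here pos is the token position, i indexes pairs of embedding dimensions, and d_model is the embedding size; each dimension pair traces out a sinusoid of a different wavelength, from 2π up to 10000·2π. A minimal NumPy sketch of this formulation might look as follows (the function name and even-d_model assumption are ours, not from the paper):

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    """Sinusoidal positional encoding from "Attention Is All You Need".

    PE(pos, 2i)   = sin(pos / 10000^(2i / d_model))
    PE(pos, 2i+1) = cos(pos / 10000^(2i / d_model))

    Assumes d_model is even.
    """
    positions = np.arange(seq_len)[:, np.newaxis]                      # (seq_len, 1)
    div_terms = np.power(10000.0, np.arange(0, d_model, 2) / d_model)  # (d_model/2,)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(positions / div_terms)  # even dimensions get sine
    pe[:, 1::2] = np.cos(positions / div_terms)  # odd dimensions get cosine
    return pe

pe = positional_encoding(seq_len=50, d_model=64)
```

The resulting `(seq_len, d_model)` matrix is simply added to the input embeddings, giving every token a unique, bounded position signature without any learned parameters.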
