
Advancing Pooling Techniques

Understand how to improve pooling techniques in convolutional neural networks by adaptively selecting distributions and addressing spatial dependencies in feature maps. Learn to apply maximum likelihood estimators and exponential family statistics to optimize pooling for complex datasets, enhancing CNN performance in rare event prediction.

The possibility of fitting distributions to feature maps opens up a myriad of pooling statistics. It also enables advanced techniques, such as adaptive selection of the pooling distribution. Such techniques matter because the optimal pooling statistic depends on the characteristics of the feature maps in a convolutional network and on the dataset.

Automatically determining the optimal pooling statistic is challenging. Moreover, MLEs turn out to be appropriate pooling statistics, but they are unavailable for some ...
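One way to sketch adaptive selection of the pooling distribution is to fit each candidate distribution to a window's activations by its maximum likelihood estimator and keep the candidate with the highest log-likelihood. The sketch below is hypothetical and restricted to two candidates with simple closed-form MLEs (normal and exponential); the function names and the choice of the fitted mean as the pooled value are illustrative assumptions, not part of the original method.

```python
import numpy as np

def normal_loglik(x):
    # Exact closed-form MLE for the normal: mu = sample mean, sigma^2 = biased variance.
    mu, var = x.mean(), max(x.var(), 1e-12)
    n = x.size
    ll = -0.5 * n * (np.log(2.0 * np.pi * var) + 1.0)
    return ll, mu  # (log-likelihood at the MLE, fitted mean)

def expon_loglik(x):
    # Closed-form MLE for the exponential rate (x >= 0): lambda = 1 / mean.
    lam = 1.0 / max(x.mean(), 1e-12)
    ll = x.size * np.log(lam) - lam * x.sum()
    return ll, 1.0 / lam  # (log-likelihood at the MLE, fitted mean)

def adaptive_pool(window):
    """Fit each candidate distribution by its closed-form MLE, keep the
    best-fitting one by log-likelihood, and pool with its fitted mean."""
    x = np.asarray(window, dtype=float).ravel()
    x = x - x.min()  # shift to nonnegative support for the exponential
    fits = {"normal": normal_loglik(x), "exponential": expon_loglik(x)}
    name = max(fits, key=lambda k: fits[k][0])
    return name, fits[name][1]
```

Peaked, roughly symmetric activations favor the normal fit, while heavily skewed activations (e.g., an object in a corner) favor the exponential fit, matching the intuition in the examples below.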

[Image: object with a color gradient]
[Image: object in a corner]

The former image results in a peaked distribution, which could come from a normal, gamma, or Weibull distribution, while the latter results in an exponential-like distribution. When the distribution is known, pooling with MLEs amounts to fitting that distribution to the feature map.
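As a concrete sketch of MLE-based pooling, each non-overlapping window can be summarized by the MLE of an exponential rate fitted to its activations, λ̂ = 1/mean. This is a minimal illustration under assumed conditions (nonnegative activations, e.g., post-ReLU); the function name and window handling are illustrative.

```python
import numpy as np

def mle_exponential_pool(fmap, k=2):
    """Pool non-overlapping k x k windows of a 2D feature map by the
    exponential-rate MLE fitted to each window: lambda_hat = 1 / mean."""
    h, w = fmap.shape
    out = np.empty((h // k, w // k))
    for i in range(h // k):
        for j in range(w // k):
            win = fmap[i * k:(i + 1) * k, j * k:(j + 1) * k]
            out[i, j] = 1.0 / max(win.mean(), 1e-12)  # fitted rate
    return out
```

For example, a window of all 2s pools to a rate of 0.5, while a window of all 4s pools to 0.25: larger average activations imply a heavier-tailed fitted exponential.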

This is straightforward for the normal and gamma distributions, as closed-form estimators exist for their parameters. For the Weibull distribution, an MLE is available for the scale λ ...
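The closed-form estimators for the normal and gamma cases can be sketched as follows. The normal MLEs are exact (sample mean and biased standard deviation); for the gamma shape, the sketch uses a standard closed-form approximation to the MLE based on s = log(mean) − mean(log x). Function names are illustrative, and the gamma formula is an approximation rather than the exact MLE.

```python
import numpy as np

def normal_mle(x):
    # Exact closed-form MLEs for the normal distribution.
    return x.mean(), x.std()  # (mu_hat, sigma_hat)

def gamma_mle(x):
    # Standard closed-form approximation to the gamma MLE (requires x > 0):
    #   s = log(mean(x)) - mean(log(x))
    #   k_hat ~= (3 - s + sqrt((s - 3)^2 + 24 s)) / (12 s)
    s = np.log(x.mean()) - np.log(x).mean()
    k = (3.0 - s + np.sqrt((s - 3.0) ** 2 + 24.0 * s)) / (12.0 * s)
    theta = x.mean() / k  # scale estimate from the mean relation E[x] = k * theta
    return k, theta
```

On a large sample drawn from a gamma distribution, the approximate shape and scale estimates land close to the true parameters, which is what makes these estimators practical as pooling statistics.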