EvoNorm Layers in TensorFlow 2

Sayak Paul

In this report, I share my experiments with the EvoNorm layers proposed in Evolving Normalization-Activation Layers. In the paper, the authors attempt to unify normalization layers and activation functions into a single computation graph. The authors claim:

Several of these layers enjoy the property of being independent from the batch statistics.

I used Colab to perform my experiments. The authors tested the EvoNorm layers on MobileNetV2, ResNets, MnasNet, and EfficientNets. I decided to try some quick experiments on a Mini Inception architecture, as shown in this blog post, and trained it on the CIFAR-10 dataset.
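To make the idea concrete, here is a minimal NumPy sketch of the batch-independent EvoNorm-S0 variant from the paper: the output is `x * sigmoid(v * x)` divided by a grouped standard deviation, followed by a learnable affine transform. The group count, epsilon, and function names are my choices for illustration, not the paper's reference implementation (which targets TensorFlow).

```python
import numpy as np

def group_std(x, groups=8, eps=1e-5):
    # Std over spatial dims and channel groups of an NHWC tensor,
    # broadcast back to the input shape.
    n, h, w, c = x.shape
    xg = x.reshape(n, h, w, groups, c // groups)
    var = xg.var(axis=(1, 2, 4), keepdims=True)
    std = np.sqrt(var + eps)
    return np.broadcast_to(std, xg.shape).reshape(n, h, w, c)

def evonorm_s0(x, gamma, beta, v, groups=8, eps=1e-5):
    # EvoNorm-S0: x * sigmoid(v * x) / group_std(x), then scale and shift.
    # gamma, beta, v are per-channel learnable parameters.
    num = x * (1.0 / (1.0 + np.exp(-v * x)))
    return num / group_std(x, groups, eps) * gamma + beta
```

Because the statistics are computed per example (over spatial positions and channel groups), the layer behaves identically at any batch size, which is the batch-independence property quoted above.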

→ Live Dashboard and setup

→ GitHub repo to reproduce results
