Maxout generalizes ReLU and leaky ReLU:
g(x) = max(a1⊤x + b1, a2⊤x + b2)
Setting a1 = 0, b1 = 0 recovers ReLU, and a1 = α·a2, b1 = α·b2 recovers leaky ReLU with slope α. The cost is that each maxout unit learns two (or, with k pieces, k) weight vectors and biases, increasing the number of parameters per neuron.
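A minimal NumPy sketch of a maxout unit with k linear pieces (function and variable names here are illustrative, not from the original); the check at the end confirms that choosing one piece as the zero map reduces maxout to ReLU:

```python
import numpy as np

def maxout(x, W, b):
    """Maxout activation: elementwise max over k affine maps.

    W: (k, d_out, d_in) weight tensor, one matrix per linear piece.
    b: (k, d_out) bias vectors.
    x: (d_in,) input.
    Returns: (d_out,) output, max over the k pieces.
    """
    z = np.einsum('kod,d->ko', W, x) + b  # (k, d_out) pre-activations
    return z.max(axis=0)

rng = np.random.default_rng(0)
d_in, d_out = 3, 2
w = rng.normal(size=(d_out, d_in))
x = rng.normal(size=d_in)

# Two pieces: (w x, 0). Maxout then equals ReLU(w x).
W = np.stack([w, np.zeros((d_out, d_in))])
b = np.zeros((2, d_out))
assert np.allclose(maxout(x, W, b), np.maximum(w @ x, 0.0))
```

Note the parameter count: a plain ReLU neuron stores one weight vector and bias, while this unit stores k of each, which is the overhead the slide refers to.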