
I know these activations differ in their definition. However, when reading relu's documentation, I see that it takes a parameter alpha with 0 as the default, and says:

relu

relu(x, alpha=0.0, max_value=None)

Rectified Linear Unit.

Arguments

    x: Input tensor.
    alpha: Slope of the negative part. Defaults to zero.
    max_value: Maximum value for the output.

Returns

    The (leaky) rectified linear unit activation: x if x > 0, alpha * x if x < 0. If max_value is defined, the result is truncated to this value.
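
For instance, a quick numeric check of that description (just a sketch, assuming Keras 2.x with the TensorFlow backend):

import numpy as np
from keras import backend as K
from keras.activations import relu

x = K.constant(np.array([-2.0, -0.5, 0.0, 1.0, 3.0]))

print(K.eval(relu(x)))             # [ 0.    0.    0.    1.    3.  ]  -> plain ReLU
print(K.eval(relu(x, alpha=0.1)))  # [-0.2  -0.05  0.    1.    3.  ]  -> leaky slope on the negative part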

There is also a LeakyReLU with similar documentation, but it is part of another module (advanced activations).

Is there a difference between them? And how should I import relu to instantiate it with alpha?

from keras.layers.advanced_activations import LeakyReLU
..
..
model.add(Dense(512, activation='linear'))
model.add(LeakyReLU(alpha=0.001))  # I'd like to use relu here instead of LeakyReLU

Note that when using LeakyReLU I'm getting the following error:

AttributeError: 'LeakyReLU' object has no attribute '__name__'

but when I use relu instead, it works:

model.add(Activation('relu'))  # this works correctly, but I can't set alpha

To sum up: what are the differences, and how can I import relu so that I can pass alpha to it?

  • Can you post the trace for the error? I can't reproduce it from the model snippet; LeakyReLU works properly.
    – nuric
    Commented May 29, 2018 at 17:08
  • Sorry, I didn't post the whole error because my goal was to understand the difference between those implementations, and I want to call relu with the alpha parameter even if I can't call LeakyReLU because of the error (I mean, the error was a motivation to avoid using LeakyReLU). The error occurs when DeadReluDetector is used, but it may be a bug in that module (I should investigate this further). Commented May 29, 2018 at 17:37

1 Answer


As far as the implementation is concerned, they call the same backend function, K.relu. The difference is that relu is an activation function, whereas LeakyReLU is a Layer defined under keras.layers. So the difference is in how you use them: an activation function has to be wrapped in a layer such as Activation (or passed via a layer's activation argument), whereas LeakyReLU is a layer that gives you a shortcut to that function with a configurable alpha value.
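
For example, a minimal sketch of both options (assuming Keras 2.x, where keras.activations.relu accepts an alpha argument; the layer sizes and input shape below are just placeholders):

from keras.models import Sequential
from keras.layers import Dense
from keras.layers.advanced_activations import LeakyReLU
from keras import activations

model = Sequential()

# Option 1: keep the Dense layer linear and add LeakyReLU as a separate layer
model.add(Dense(512, activation='linear', input_shape=(784,)))  # placeholder input shape
model.add(LeakyReLU(alpha=0.001))

# Option 2: wrap the relu activation function with a fixed alpha and pass the
# wrapper wherever an activation is expected
def relu_with_alpha(x):  # hypothetical helper name
    return activations.relu(x, alpha=0.001)

model.add(Dense(512, activation=relu_with_alpha))

Either way the forward pass ends up in the same K.relu call; the difference is only where the alpha value is configured.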

  • Thanks for your answer, it's really clear, but how should I import/use relu to set the alpha value? Commented May 29, 2018 at 17:32
  • Use LeakyReLU as a layer; it wraps the relu function for you. As I commented, I can't reproduce your error. Your usage is correct.
    – nuric
    Commented May 29, 2018 at 17:35
  • Yes, the error is in another module (DeadReluDetector, from contrib), so I'm not sure where the bug is, but at this moment I don't care; I just want to use relu while setting the alpha value. Commented May 29, 2018 at 17:42
  • Also note that when using the standard 'relu' activation in a dense layer or similar, alpha stays at 0 and is not 'trainable'.
    – modesitt
    Commented May 31, 2018 at 18:31
