
[RFC] Adds ELU to LeakyReLU activation layer #718

Merged: 1 commit into apache:master on Nov 25, 2015

Conversation

vchuravy
Contributor

Adds the ELU operator introduced in [1]. This is the first operator I have added, so I would welcome a thorough review.

[1] Clevert D-A, Unterthiner T, Hochreiter S. Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs). arXiv [cs.LG]. 2015. Available: http://arxiv.org/abs/1511.07289
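For reference, the ELU of [1] is the identity for positive inputs and saturates smoothly to -alpha for negative ones. Below is a minimal standalone sketch of the forward and backward rules; the PR itself expresses these as mshadow expression-template operators inside the LeakyReLU layer, and the function names here are illustrative only, not the ones used in the patch.

```cpp
#include <cmath>
#include <cstdio>

// Sketch of the ELU rules from Clevert et al. (2015); hypothetical
// free functions, not the mshadow operators the PR actually adds.
float elu_forward(float x, float alpha) {
  // f(x) = x for x > 0, alpha * (exp(x) - 1) otherwise.
  return x > 0.0f ? x : alpha * (std::exp(x) - 1.0f);
}

float elu_backward(float x, float alpha) {
  // f'(x) = 1 for x > 0, alpha * exp(x) otherwise. Since
  // alpha * exp(x) = f(x) + alpha, the negative branch can also be
  // recovered cheaply from the stored forward output.
  return x > 0.0f ? 1.0f : alpha * std::exp(x);
}

int main() {
  // With alpha = 1.0: ELU(-1) ~ -0.632 and its slope there ~ 0.368.
  std::printf("%f %f\n", elu_forward(-1.0f, 1.0f), elu_backward(-1.0f, 1.0f));
  return 0;
}
```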

Adds the ELU operator introduced in D.-A. Clevert, T. Unterthiner, and S. Hochreiter (2015)
@antinucleon
Contributor

I think ELU should be put in https://github.com/dmlc/mxnet/blob/master/src/operator/activation-inl.h, and we can put leaky ELU here.

@antinucleon
Contributor

I double-checked; please ignore my last message. I posted it when I had just woken up. This LGTM.

antinucleon added a commit that referenced this pull request on Nov 25, 2015
[RFC] Adds ELU to LeakyReLU activation layer
@antinucleon merged commit 3375f7a into apache:master on Nov 25, 2015
@vchuravy
Contributor Author

The PR for Caffe has some changes that are worth keeping an eye on: BVLC/caffe#3388
