added support for ELU activation units with Caffe #34

Merged
merged 1 commit from elu into master on Jan 3, 2016
Conversation

@beniz (Collaborator) commented on Nov 29, 2015

Adds support for ELU activations, via PR BVLC/caffe#3388.
In the API, use activation: "elu", with elu_alpha setting the value of the saturation control variable alpha; see the sketch below.
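
For illustration, a minimal sketch of the mllib parameters a service creation call might carry with this option; the surrounding payload (template name, layer sizes) is assumed here for the example, and only the activation and elu_alpha keys come from this PR:

```json
{
  "parameters": {
    "mllib": {
      "template": "mlp",
      "layers": [200, 100],
      "activation": "elu",
      "elu_alpha": 1.0
    }
  }
}
```

If elu_alpha is left out, alpha should fall back to the default of 1.0 used by the underlying Caffe ELU layer from BVLC/caffe#3388.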

Original paper:
Clevert, D.-A., Unterthiner, T., & Hochreiter, S. (2015). Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs). http://arxiv.org/abs/1511.07289
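
For reference, the ELU nonlinearity defined in the paper; alpha controls the value the unit saturates to for negative inputs:

```latex
f(x) =
\begin{cases}
x & \text{if } x > 0 \\
\alpha\,(e^{x} - 1) & \text{if } x \le 0
\end{cases}
```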

@beniz self-assigned this on Nov 29, 2015
@beniz added a commit that referenced this pull request on Jan 3, 2016: "added support for ELU activation units with Caffe"
@beniz merged commit 66ff4af into master on Jan 3, 2016
@beniz deleted the elu branch on September 16, 2016