
Activation Functions #94

Open
@balapriyac

Description


Write code for the following activation functions.

Be sure to include plots and elaborate on the advantages and disadvantages of each
(for example, the vanishing gradient and dead neuron problems).

Sigmoid, ReLU, Leaky ReLU, tanh, Swish, Mish
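
A minimal sketch of what a starting point could look like, assuming NumPy and Matplotlib. The functions use their standard definitions (Swish as x·sigmoid(x), Mish as x·tanh(softplus(x))), and the comments note the gradient issues mentioned above:

```python
import numpy as np
import matplotlib.pyplot as plt

def sigmoid(x):
    # Squashes input to (0, 1); saturates for large |x|, causing vanishing gradients.
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Zero for negative inputs; cheap, but units can get stuck at 0 (dead neuron problem).
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Small negative slope keeps a gradient flowing for x < 0.
    return np.where(x > 0, x, alpha * x)

def tanh(x):
    # Zero-centered, squashes to (-1, 1); still saturates at the tails.
    return np.tanh(x)

def swish(x):
    # Smooth, non-monotonic: x * sigmoid(x).
    return x * sigmoid(x)

def mish(x):
    # Smooth, non-monotonic: x * tanh(softplus(x)), with softplus(x) = ln(1 + e^x).
    return x * np.tanh(np.log1p(np.exp(x)))

activations = {
    "Sigmoid": sigmoid,
    "ReLU": relu,
    "Leaky ReLU": leaky_relu,
    "tanh": tanh,
    "Swish": swish,
    "Mish": mish,
}

# Plot all six functions on a shared input range.
x = np.linspace(-5, 5, 500)
fig, axes = plt.subplots(2, 3, figsize=(12, 6), sharex=True)
for ax, (name, fn) in zip(axes.flat, activations.items()):
    ax.plot(x, fn(x))
    ax.set_title(name)
    ax.grid(True)
fig.tight_layout()
plt.show()
```

The written discussion per function (saturation behavior, gradient flow, computational cost) could then accompany each subplot.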
