TarantulaNN

License: MIT

🕷🕸 Implementation of a Deep Neural Network from scratch, with flexible architectures and activation functions


Layers:

  • FullyConnectedLayer
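
A fully connected layer applies an affine transform z = Wx + b to its input and propagates gradients back through it during training. The sketch below is a minimal NumPy illustration of that idea, assuming a column-vector convention; the class and method names are illustrative assumptions and are not taken from this repository's API.

```python
import numpy as np

class FullyConnectedLayer:
    """Illustrative dense layer sketch (not this repository's implementation)."""

    def __init__(self, n_inputs, n_outputs):
        # Small random weights and zero biases
        self.weights = np.random.randn(n_outputs, n_inputs) * 0.01
        self.biases = np.zeros((n_outputs, 1))

    def forward(self, x):
        # Affine transform: z = W x + b, with x of shape (n_inputs, 1)
        self.input = x
        return self.weights @ x + self.biases

    def backward(self, grad_output, learning_rate):
        # grad_output is dC/dz; compute gradients w.r.t. weights, biases and input
        grad_weights = grad_output @ self.input.T
        grad_biases = grad_output
        grad_input = self.weights.T @ grad_output
        # Plain gradient-descent parameter update
        self.weights -= learning_rate * grad_weights
        self.biases -= learning_rate * grad_biases
        return grad_input
```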

Activation functions:

  • ReLU (Rectified Linear Unit)
  • Sigmoid
  • Tanh
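
For reference, the three activation functions and the derivatives needed for backpropagation can be written in a few lines of NumPy. This is an illustrative sketch, not code from this repository.

```python
import numpy as np

def relu(z):
    # Rectified Linear Unit: max(0, z), applied element-wise
    return np.maximum(0.0, z)

def sigmoid(z):
    # Logistic function, squashes values into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    # Hyperbolic tangent, squashes values into (-1, 1)
    return np.tanh(z)

# Derivatives used during backpropagation
def relu_prime(z):
    return (z > 0).astype(float)

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1.0 - s)

def tanh_prime(z):
    return 1.0 - np.tanh(z) ** 2
```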

Training:

  • mini-batch gradient descent
  • stochastic gradient descent
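
Mini-batch gradient descent updates the parameters on small random subsets of the training data at each step; with a batch size of 1 it reduces to stochastic gradient descent. The sketch below shows only the batching loop, assuming a hypothetical `update_fn` callback that stands in for one forward/backward pass plus parameter update; it is not this repository's training code.

```python
import numpy as np

def minibatch_gradient_descent(X, y, update_fn, batch_size=32, epochs=10):
    # Shuffle the data each epoch and call update_fn on every mini-batch.
    # batch_size=1 turns this into stochastic gradient descent.
    n = X.shape[0]
    for _ in range(epochs):
        order = np.random.permutation(n)
        for start in range(0, n, batch_size):
            idx = order[start:start + batch_size]
            update_fn(X[idx], y[idx])
```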

Cost function:

  • Sum of squares
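
The sum-of-squares cost measures C = ½ Σᵢ (ŷᵢ − yᵢ)², and its gradient with respect to the predictions is simply ŷ − y, which is the signal backpropagation starts from. A minimal NumPy sketch (illustrative, not the repository's code):

```python
import numpy as np

def sum_of_squares_cost(predictions, targets):
    # C = 1/2 * sum((y_hat - y)^2)
    return 0.5 * np.sum((predictions - targets) ** 2)

def sum_of_squares_grad(predictions, targets):
    # dC/dy_hat = y_hat - y
    return predictions - targets
```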

License

This project is licensed under the terms of the MIT license; see LICENSE.
