
MalayAgr/DeepNeuralNetworksFromScratch

Different kinds of deep neural networks (DNNs) implemented from scratch using Python and NumPy, with a TensorFlow-like object-oriented API.

Deep Neural Networks From Scratch


This is an implementation of deep neural networks built using nothing but Python and NumPy. I took up this project to complement the Deep Learning Specialization on Coursera, taught by Andrew Ng.

The following features are currently supported:

  • Layers:
    • Dense
    • Conv2D
    • DepthwiseConv2D
    • SeparableConv2D
    • Conv2DTranspose
    • MaxPooling2D
    • AveragePooling2D
    • BatchNorm
    • Dropout
    • Flatten
    • Add
    • Concatenate
  • Activations:
    • Linear
    • Sigmoid
    • Tanh
    • ReLU
    • LeakyReLU
    • ELU
    • Softmax
  • Losses:
    • BinaryCrossEntropy
    • CategoricalCrossEntropy
    • MeanSquaredError
  • Optimizers:
    • Vanilla SGD
    • SGD with momentum
    • RMSProp
    • Vanilla Adam
    • Adam with AMSGrad
  • Learning Rate Decay:
    • TimeDecay
    • ExponentialDecay
    • CosineDecay
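
To give a flavor of what "from scratch with Python and NumPy" means, here is a minimal, self-contained sketch of a dense layer with hand-written forward and backward passes and a vanilla SGD update. The class and method names are illustrative only and are not the project's actual API.

```python
import numpy as np

class Dense:
    """Minimal fully connected layer: y = x @ W + b (illustrative sketch)."""

    def __init__(self, in_features, out_features):
        # He-style initialization keeps activations well-scaled.
        self.W = np.random.randn(in_features, out_features) * np.sqrt(2.0 / in_features)
        self.b = np.zeros(out_features)

    def forward(self, x):
        self.x = x  # cache the input for the backward pass
        return x @ self.W + self.b

    def backward(self, grad_out, lr=0.01):
        # Gradients of the loss w.r.t. parameters and input.
        grad_W = self.x.T @ grad_out
        grad_b = grad_out.sum(axis=0)
        grad_x = grad_out @ self.W.T
        # Vanilla SGD update.
        self.W -= lr * grad_W
        self.b -= lr * grad_b
        return grad_x

# One training step on random data with a mean squared error loss.
np.random.seed(0)
x = np.random.randn(4, 3)
y = np.random.randn(4, 2)

layer = Dense(3, 2)
pred = layer.forward(x)
loss = ((pred - y) ** 2).mean()
grad = 2 * (pred - y) / pred.size  # d(MSE)/d(pred)
layer.backward(grad, lr=0.1)
```

Repeating the forward pass, loss computation, and backward pass in a loop gives a complete training procedure.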

It is also easy to add new layers, activations, losses, optimizers, and decay algorithms.

Note: There is no automatic differentiation. When extending the library, users need to define the derivatives required for backpropagation themselves.
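
For example, adding a new activation means supplying both the function and its hand-derived gradient. The sketch below shows this for the Swish activation, f(x) = x · sigmoid(x); the class shape and method names are hypothetical and do not reflect the project's actual interface.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class Swish:
    """Swish activation: f(x) = x * sigmoid(x) (illustrative sketch).

    Both forward() and backward() are written out explicitly because
    there is no automatic differentiation to derive the gradient.
    """

    def forward(self, x):
        self.x = x  # cache the input for backward()
        return x * sigmoid(x)

    def backward(self, grad_out):
        s = sigmoid(self.x)
        # Hand-derived via the product rule: f'(x) = s + x * s * (1 - s)
        return grad_out * (s + self.x * s * (1 - s))

act = Swish()
out = act.forward(np.array([-1.0, 0.0, 1.0]))
```

A finite-difference check against `backward()` is a quick way to confirm a hand-derived gradient before wiring a new component into a network.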

Hope you like it! Happy learning!


MIT License
Created June 11, 2021
Updated December 9, 2025