80 results for “topic:l2-regularization”
A deep learning project using fine-tuned RoBERTa to classify mental health sentiments from text, aiming to provide early insights and support. ⚕️❤️
A framework for building neural networks and committees, and for creating agents with parallel computation.
Logistic regression in machine learning, covering both theory and Python code. Topics include assumptions, multi-class classification, regularization (L1 and L2), Weight of Evidence, and Information Value.
Short description for quick search
Given network connection data, the model predicts whether a connection contains an intrusion. Starts as binary classification (good vs. bad connections), extends to multi-class classification, and most prominently features an importance analysis of the input features.
Analysis of the robustness of non-negative matrix factorization (NMF) techniques: L2-norm, L1-norm, and L2,1-norm
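The three norms compared in the NMF robustness analysis above can be computed directly with NumPy. This is a generic illustration, not code from that repository; the example matrix and the column-wise convention for the L2,1-norm are assumptions (some papers sum row norms instead).

```python
import numpy as np

# Hypothetical residual matrix, e.g. X - WH in an NMF fit
X = np.array([[1.0, -2.0],
              [3.0,  0.0]])

# L2 (Frobenius) norm: sqrt of the sum of squared entries
fro = np.sqrt(np.sum(X ** 2))

# L1 norm: sum of absolute entries, less sensitive to outliers
l1 = np.sum(np.abs(X))

# L2,1 norm: sum of the L2 norms of the columns,
# robust to outlier samples while staying smooth per column
l21 = np.sum(np.sqrt(np.sum(X ** 2, axis=0)))
```

Because the L2-norm squares entries, a single large outlier dominates it, while the L1- and L2,1-norms grow only linearly in that entry, which is the usual motivation for the robust NMF variants.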
Regularized Logistic Regression
Modifiable neural network
Wrapper on top of liblinear-tools
Water and lipid signal removal in MRSI by L2 regularization (submitted by Liangjie Lin)
Centralized Disaster Response and Inventory Management System that leverages AI and Google Cloud Technologies to predict disasters, optimize resource management, and provide real-time coordination.
Implemented a neural network from scratch in Python with just NumPy, no frameworks involved.
Course: Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization. Second course of the Deep Learning specialization. This repository contains all the solved exercises. https://www.coursera.org/learn/neural-networks-deep-learning
An OOP Deep Neural Network using a similar syntax as Keras with many hyper-parameters, optimizers and activation functions available.
No description provided.
A "from-scratch" 2-layer neural network for MNIST classification built in pure NumPy, featuring mini-batch gradient descent, momentum, L2 regularization, and evaluation tools — no ML libraries used.
The module allows working with simple neural networks (currently the simplest multilayer perceptron model, trained with backpropagation and the Leaky ReLU activation function).
A simple Python repository for perceptron-based text mining, covering linguistic preprocessing of datasets for text classification and retrieval of similar text for a given query.
Simple Demo to show how L2 Regularization avoids overfitting in Deep Learning/Neural Networks
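The core idea behind a demo like the one above can be sketched in a few lines: L2 regularization adds a penalty proportional to the squared weight norm to the loss, so large weights are discouraged. This is a minimal illustration with assumed names (`l2_penalized_loss`, `lam`), not code from that repository.

```python
import numpy as np

def l2_penalized_loss(y_true, y_pred, weights, lam=0.01):
    """Mean squared error plus an L2 penalty on the weights.

    The penalty lam * ||w||^2 makes large weights costly,
    which is how L2 regularization curbs overfitting.
    """
    mse = np.mean((y_true - y_pred) ** 2)
    l2 = lam * np.sum(weights ** 2)
    return mse + l2

# For identical predictions, the model with larger weights
# pays a strictly higher regularized loss.
y = np.array([1.0, 0.0])
pred = np.array([0.9, 0.1])
w_small = np.array([0.1, 0.2])
w_large = np.array([3.0, 4.0])
assert l2_penalized_loss(y, pred, w_large) > l2_penalized_loss(y, pred, w_small)
```

Tuning `lam` trades off fit against weight magnitude: `lam = 0` recovers the unregularized loss, while larger values push the optimum toward smaller, smoother weight configurations.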
PyTorch implementation of important functions for WAIL and GMMIL
No description provided.
A framework for implementing convolutional neural networks and fully connected neural networks.
The aim was to create and implement a predictive model that forecasts the number of items sold 8 weeks ahead.
Deep Learning Course | Home Works | Spring 2021 | Dr. MohammadReza Mohammadi
Repository for Assignment 1 for CS 725
Fully connected neural network with Adam optimizer, L2 regularization, Batch normalization, and Dropout using only numpy
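In NumPy-only implementations like the one above, L2 regularization typically shows up as an extra `lam * w` term in the gradient update. A minimal sketch of one such step, with assumed names (`sgd_step_l2`, `lr`, `lam`) rather than that repository's actual API:

```python
import numpy as np

def sgd_step_l2(w, grad, lr=0.1, lam=0.01):
    """One gradient-descent step with an L2 penalty.

    Differentiating lam/2 * ||w||^2 adds lam * w to the data
    gradient, shrinking weights toward zero each step
    (often called weight decay).
    """
    return w - lr * (grad + lam * w)

w = np.array([1.0, -2.0])
g = np.zeros(2)  # zero data gradient: only the penalty acts
w_next = sgd_step_l2(w, g)
# each step multiplies the weights by (1 - lr * lam)
```

Note that with adaptive optimizers such as Adam, adding `lam * w` to the gradient is not equivalent to decoupled weight decay, since the penalty gradient also gets rescaled by the adaptive moment estimates.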
Generic L-layer 'straight in Python' fully connected Neural Network implementation using numpy.
Image classification with a CNN using Keras (TensorFlow backend) on the Fashion MNIST dataset.
This is a repository with the assignments of IE675b Machine Learning course at University of Mannheim.
During this study we will explore the different regularisation methods that can be used to address the problem of overfitting in a given Neural Network architecture, using the balanced EMNIST dataset.