Top Repositories
In this project, I worked with a small corpus of simple sentences. I tokenized the words into n-grams with the NLTK library and performed word-level and character-level one-hot encoding. I also used the Keras Tokenizer to tokenize the sentences and implemented word embeddings with the Embedding layer. For sentiment analysis…
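The word-level one-hot step described above can be sketched without any dependencies; the corpus below is made up for illustration, and the Keras Tokenizer automates the same index-building procedure:

```python
# Minimal sketch of word-level one-hot encoding on a toy corpus
# (illustrative sentences, not the project's actual data).
corpus = ["the cat sat on the mat", "the dog sat"]

# Build a word -> index vocabulary (index 0 reserved for padding, as Keras does)
vocab = {}
for sentence in corpus:
    for word in sentence.split():
        vocab.setdefault(word, len(vocab) + 1)

def one_hot(sentence, vocab):
    """Return a list of one-hot vectors, one per word."""
    size = len(vocab) + 1  # +1 for the reserved padding index
    vectors = []
    for word in sentence.split():
        vec = [0] * size
        vec[vocab[word]] = 1
        vectors.append(vec)
    return vectors

encoded = one_hot("the cat sat", vocab)
print(len(encoded), len(encoded[0]))  # 3 words, vector length 6+1 = 7
```

Character-level encoding works the same way with a character vocabulary instead of a word vocabulary.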
This code builds a pipelined training model in PyTorch to classify breast cancer cases as positive or negative. It preprocesses the data with standard scaling and label encoding, trains the model using manual weight updates and binary cross-entropy loss, then evaluates it on the test set and prints the accuracy.
Binary classification of breast cancer using PyTorch. Used StandardScaler, LabelEncoder, Dataset, DataLoader, custom nn.Module model, BCELoss, and SGD. Focused on implementing a complete training pipeline, not optimizing accuracy.
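A minimal sketch of the pipeline these two projects describe, using synthetic data in place of the breast-cancer dataset (StandardScaler, a custom nn.Module, BCELoss, and SGD):

```python
# Sketch of a PyTorch binary-classification pipeline on synthetic data
# (random features, not the actual breast-cancer dataset).
import torch
import torch.nn as nn
from sklearn.preprocessing import StandardScaler

torch.manual_seed(0)
X = torch.randn(64, 10)
y = (X[:, 0] > 0).float().unsqueeze(1)  # toy binary target

# Standard-scale the features, as the project does
X = torch.tensor(StandardScaler().fit_transform(X.numpy()), dtype=torch.float32)

class Classifier(nn.Module):
    def __init__(self, n_features):
        super().__init__()
        self.linear = nn.Linear(n_features, 1)

    def forward(self, x):
        return torch.sigmoid(self.linear(x))  # probability for BCELoss

model = Classifier(10)
loss_fn = nn.BCELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

for epoch in range(50):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()

accuracy = ((model(X) > 0.5).float() == y).float().mean().item()
print(f"train accuracy: {accuracy:.2f}")
```

The real project would also label-encode the diagnosis column and evaluate on a held-out test split rather than the training data.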
This repository contains code to train and evaluate a neural network model on a subset of the Fashion MNIST dataset using PyTorch. The model achieves remarkable accuracy after 100 epochs of training.
This Optuna-based hyperparameter optimization study performs both static and dynamic tuning of hyperparameters for machine learning models (SVM, RandomForest, and GradientBoosting) to maximize accuracy. It tracks and analyzes model performance, displays the best trial results, and compares the average performance of each classifier.
This project demonstrates the implementation of a Convolutional Neural Network (CNN) using PyTorch, designed to classify fashion items from the Fashion MNIST dataset. The model was trained using GPU acceleration to speed up computation.
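A sketch of a small CNN of the kind used for Fashion MNIST; the layer sizes here are illustrative, not the repository's exact architecture:

```python
# Sketch of a compact CNN for 28x28 grayscale Fashion MNIST images.
import torch
import torch.nn as nn

class FashionCNN(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # 28x28 -> 28x28
            nn.ReLU(),
            nn.MaxPool2d(2),                             # -> 14x14
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # -> 7x7
        )
        self.classifier = nn.Linear(32 * 7 * 7, num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

# Move to GPU when available, as the project does for acceleration
device = "cuda" if torch.cuda.is_available() else "cpu"
model = FashionCNN().to(device)
logits = model(torch.randn(4, 1, 28, 28, device=device))
print(logits.shape)  # torch.Size([4, 10])
```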
Repositories
This project uses transfer learning with a pre-trained VGG16 model to classify Fashion MNIST images. The convolutional layers are frozen, and a custom classifier is added. The model is fine-tuned using the Adam optimizer and CrossEntropyLoss, with training and evaluation loops running on GPU for efficient processing.
This project applies transfer learning using a pretrained AlexNet model to classify FashionMNIST images. The model was fine-tuned and trained on GPU after necessary preprocessing. It achieved 93.16% accuracy on the test set and 95.87% on the training set.
A simple QA model using RNNs to predict answers from custom question-answer pairs. Includes text preprocessing, vocabulary building, and PyTorch training. Ideal for NLP beginners and chatbot development. 🚀 #PyTorch #NLP #RNN #AI
This repository implements a GPU-accelerated next-word prediction model using PyTorch and LSTM. It includes data preprocessing with NLTK, vocabulary creation, training on tokenized text, and generating text predictions, starting from a given input phrase.
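The model side of such a next-word predictor can be sketched on a tiny made-up vocabulary (the repository builds its vocabulary from an NLTK-tokenized corpus instead):

```python
# Sketch of an LSTM next-word prediction model on a toy vocabulary.
import torch
import torch.nn as nn

vocab = ["<pad>", "the", "cat", "sat", "on", "mat"]
word2idx = {w: i for i, w in enumerate(vocab)}

class NextWordLSTM(nn.Module):
    def __init__(self, vocab_size, embed_dim=16, hidden_dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, x):
        emb = self.embed(x)
        output, _ = self.lstm(emb)
        return self.out(output[:, -1, :])  # logits for the next word

model = NextWordLSTM(len(vocab))
seq = torch.tensor([[word2idx[w] for w in ["the", "cat", "sat"]]])
logits = model(seq)
predicted = vocab[logits.argmax(dim=-1).item()]
print(logits.shape, predicted)  # (1, vocab_size) logits; untrained guess
```

Generation from an input phrase works by repeatedly appending the predicted word to the sequence and feeding it back in; training minimizes cross-entropy between these logits and the actual next token.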
We utilized the VGG16 model for filter visualization, class activation heatmaps, and the Grad-CAM algorithm. We implemented functions to fetch and manipulate NumPy output values, generate filter visualizations, and create grids of filter response patterns. Through these techniques, we gained insight into the model's learned patterns.
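The core of the Grad-CAM idea mentioned above fits in a few lines. This sketch uses a tiny untrained CNN instead of VGG16 so it runs without downloading weights: the gradient of a class score with respect to the last conv feature maps weights those maps into a class-discriminative heatmap:

```python
# Compact sketch of Grad-CAM on a tiny stand-in CNN (not the VGG16 setup).
import torch
import torch.nn as nn
import torch.nn.functional as F

conv = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU())
head = nn.Linear(8, 5)  # 5 toy classes

x = torch.randn(1, 3, 32, 32)
feature_maps = conv(x)                          # (1, 8, 32, 32)
feature_maps.retain_grad()                      # keep grads on this non-leaf
score = head(feature_maps.mean(dim=(2, 3)))[0]  # global-avg-pool + linear
score[2].backward()                             # gradient of class-2 score

# Channel weights = spatially averaged gradients; weighted sum -> heatmap
weights = feature_maps.grad.mean(dim=(2, 3), keepdim=True)  # (1, 8, 1, 1)
cam = F.relu((weights * feature_maps).sum(dim=1)).detach()  # (1, 32, 32)
print(cam.shape)
```

With VGG16 the same computation is applied to the last convolutional block's activations, and the heatmap is upsampled and overlaid on the input image.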
CatandDogsWithAugmentation
No description provided.