22 results for “topic:transformer-model”
Inspired by Andrej Karpathy’s "Let’s Build GPT", this project guides you step‑by‑step to build a GPT from scratch, demystifying its architecture through clear, hands‑on code.
Explainability toolkit for retrieval models. Explains the predictions of vector search models (embedding similarity models, siamese encoders, bi-encoders, dense retrieval models). Debug your vector search models for RAG or agentic AI systems.
This project develops and fine-tunes a TimeSeriesTransformer model to forecast EURUSD 5-minute closing prices, serving as a modern counterpart to a baseline LSTM model.
A tiny, no-strings-attached framework for quickly binding third-party models to extract entities from cells of long tabular data.
Educational, from-scratch implementation of a LLaMA-style LLM using PyTorch to explore Transformer architecture fundamentals.
This repository applies two different Tabular Transformer architectures to operating-system fingerprinting across three different datasets.
This project demonstrates text summarization using the BART (Bidirectional and Auto-Regressive Transformers) model. BART is a transformer model trained as a denoising autoencoder and is effective for text generation tasks such as summarization.
🌿 GreenSight AI: Real-Time Terrain Segmentation for Forest Monitoring. GreenSight AI is a semantic segmentation system that analyzes forest terrain using deep learning, segmenting forest images into 10 detailed terrain classes to enable automated environmental monitoring and land analysis. Final IoU score: 0.2638.
This repository contains a machine-translation model for the French and English languages.
FluxPipeline is a prototype experimental project that provides a framework for working with the FLUX.1-schnell image generation model. This project is intended for educational and experimental purposes only.
An institutional-grade cryptocurrency market intelligence platform powered by a custom Transformer V3 neural network. Delivers 7-day price forecasts, real-time risk analytics, and automated sentiment tracking via a high-performance React 19 & FastAPI architecture.
A deep learning project for generating emotionally expressive music. Using Transformer and GAN models, it enables user-controlled music creation based on specified emotions, aiming to produce realistic, human-like melodies.
A multi-label text classification model that classifies publications into multiple task categories based on the paper's abstract.
BLIP Image Captioning + GPT-2 Happy Model: generate joyful responses to image captions using state-of-the-art NLP and computer vision. Pretrained models and data preprocessing are included for seamless integration. Explore the intersection of deep learning, sentiment analysis, and language generation.
Using just machine learning, can we convert the kanji in Japanese sentences to hiragana?
No description provided.
A light implementation of the 2017 Google paper "Attention Is All You Need".
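The core of that paper is scaled dot-product attention. A minimal NumPy sketch of the formula, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V, is shown below; the shapes and random inputs are illustrative and not taken from the repository above.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (n_queries, n_keys) similarities
    scores -= scores.max(axis=-1, keepdims=True)  # subtract row max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the key axis
    return weights @ V, weights

# Illustrative shapes: 4 queries and 6 key/value pairs, d_k = 8, d_v = 16.
rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))
K = rng.standard_normal((6, 8))
V = rng.standard_normal((6, 16))
out, w = scaled_dot_product_attention(Q, K, V)
```

Each output row is a convex combination of the value rows, with weights that sum to 1 across the keys.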
Deep Learning: The Foundation of AI
Deep learning | Built a transformer-based email classifier using BERT to detect spam with 93% accuracy, including preprocessing, class balancing, embeddings, model training, and evaluation.
BioBERT-based text classifier to predict PRETOX-related text (PRETOX_REL vs NO_PRETOX_REL).
🛡 Secure LLM apps by managing untrusted content through a fast, local, model-agnostic pipeline with shared security checks.
Enterprise Text Classification Model Selection Framework: an automated decision-support system for selecting optimal transformer models in production text pipelines. It evaluates BERT, DistilBERT, and ELECTRA across accuracy, speed, and cost metrics for finance, healthcare, legal, and customer-service applications.
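One common way such a selection framework can work is a weighted score over the evaluated metrics. The sketch below is a hypothetical illustration: the per-model accuracy, latency, and cost figures and the weights are made up for the example and are not the framework's actual benchmarks.

```python
# Illustrative metrics per model: (accuracy, latency in ms, cost per 1k requests in USD).
# All figures are assumptions for the sketch, not measured values.
CANDIDATES = {
    "BERT":       (0.92, 120.0, 0.40),
    "DistilBERT": (0.90,  60.0, 0.20),
    "ELECTRA":    (0.93, 100.0, 0.35),
}

def select_model(weights=(0.6, 0.25, 0.15)):
    """Pick the model with the best weighted score: reward accuracy,
    penalize latency and cost (both normalized to the worst candidate)."""
    w_acc, w_speed, w_cost = weights
    max_latency = max(stats[1] for stats in CANDIDATES.values())
    max_cost = max(stats[2] for stats in CANDIDATES.values())

    def score(stats):
        acc, latency, cost = stats
        return (w_acc * acc
                + w_speed * (1 - latency / max_latency)
                + w_cost * (1 - cost / max_cost))

    return max(CANDIDATES, key=lambda name: score(CANDIDATES[name]))
```

With the default weights the cheaper, faster DistilBERT wins; shifting all weight onto accuracy (`weights=(1.0, 0.0, 0.0)`) selects ELECTRA instead. A real deployment would plug in measured metrics per domain (finance, healthcare, legal, customer service) rather than these placeholder numbers.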