21 results for “topic:modernbert”
Hallucination-prevention RAG system with verbatim span extraction. Ensures all generated content is grounded in source documents with exact citations.
Easy ModernBERT fine-tuning and multi-task learning
ModernBERT model optimized for Apple Neural Engine.
ClassyText is a demo for zero-shot text classification using ModernBERT-large from Hugging Face.
Open, Lightweight Model for AI Safety.
End-to-end pipeline that identifies specialized research papers through automated classification, demonstrated with an LLMOps use case that includes data ingestion, model training, evaluation, and deployment.
Code for EXIST 2025 Task 1.1, 1.2 and 1.3
Challenge to determine whether a sentence from a news article expresses the author's subjective view or presents an objective view of the covered topic
High-performance, edge-native compliance engine for the Fair Housing Act (FHA). Powered by ModernBERT, providing privacy-first local inference to detect real estate violations in real-time.
Trajectory classification with ModernBERT, the new BERT architecture
A fine-tuned ModernBERT model for named entity recognition (NER), trained on the CoNLL-2003 dataset to identify persons, organizations, locations, and miscellaneous entities in English text
An example workflow for fine-tuning ModernBERT for a classification task using the IMDB dataset.
This is my attempt at writing an AI-detector API by fine-tuning ModernBERT. The story behind this project is discussed in the README. You can find a link to the model playground below.
Data processing utilities and training code for the r/changemyview dataset
Fine-tuned ModernBERT for software-industry-related article summaries
Detect duplicate & unused Python code via AST hashing, Jaccard similarity, and semantic embeddings (ModernBERT, C2LLM, EmbeddingGemma). CLI + Python API with hybrid synthesis
RAG system with a fine-tuned ModernBERT financial embedding model (Matryoshka), LangGraph query routing, FastAPI backend, Next.js Frontend, and Docker
We introduce a binarized approach to Lexical Complexity Prediction (Binary LCP) and systematically compare two generations of encoder-only Transformer models: BERT and ModernBERT. Work completed as part of Natural Language Processing, DATASCI 266.
No description provided.
ModChemBERT: ModernBERT as a Chemical Language Model
This repository contains code for benchmarking ModernBERT, RoBERTa, and OPT-350m on multi-class emotion classification using 8-bit quantization, backbone freezing, and LoRA-based PEFT.