7 results for “topic:nucleus-sampling”
No description provided.
Builds N-gram language models and applies them to text generation.
TensorFlow 2 implementation of Transformer-XL for language modeling.
Seq2Seq model implemented in PyTorch, with temperature sampling, top-k sampling, top-p (nucleus) sampling, and beam search.
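The top-p / nucleus sampling these repos mention can be sketched in a few lines; the snippet below is an illustrative NumPy version, not code from any of the listed repositories (the function name `nucleus_sample` and the default `p=0.9` are assumptions):

```python
import numpy as np

def nucleus_sample(logits, p=0.9, rng=None):
    """Top-p (nucleus) sampling: keep the smallest set of tokens whose
    cumulative probability reaches p, then sample from that set."""
    rng = rng or np.random.default_rng()
    probs = np.exp(logits - logits.max())     # stable softmax
    probs /= probs.sum()
    order = np.argsort(probs)[::-1]           # tokens by descending probability
    cum = np.cumsum(probs[order])
    cutoff = np.searchsorted(cum, p) + 1      # smallest nucleus covering mass p
    nucleus = order[:cutoff]
    nucleus_probs = probs[nucleus] / probs[nucleus].sum()
    return int(rng.choice(nucleus, p=nucleus_probs))
```

With a sharply peaked distribution and a small `p`, the nucleus collapses to the single most likely token, which is what makes top-p adaptive compared with a fixed top-k cut.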
The LLM Defense Framework enhances large language model security through post-processing defenses and statistical guarantees based on one-class SVM. It combines advanced sampling methods with adaptive policy updates and comprehensive evaluation metrics, providing researchers and practitioners with tools to build more secure AI systems.
Research-grade medical AI chatbot combining BioBERT embeddings with a custom multi-layer Transformer decoder. Provides interactive Q&A for doctor–patient style queries via Streamlit, supporting Beam Search and Nucleus Sampling. Designed for educational and research purposes.
A short TeX paper formalizing the “Anthem” decoding recipe (temp=0.75, top_k=50, top_p=0.95, min_p=0.05) and explaining why pairing it with a strong persona/system prompt produces coherent, agentic “thinking-being” outputs.
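The "Anthem" recipe stacks four standard decoding filters. As a hedged sketch only (the helper name `anthem_filter` is hypothetical, and applying the filters in the order temperature → top-k → top-p → min-p is an assumption, not something stated in the listing), it might look like:

```python
import numpy as np

def anthem_filter(logits, temp=0.75, top_k=50, top_p=0.95, min_p=0.05):
    """Return a truncated, renormalized distribution to sample from,
    combining temperature, top-k, top-p, and min-p filtering."""
    scaled = logits / temp
    probs = np.exp(scaled - scaled.max())              # stable softmax
    probs /= probs.sum()
    order = np.argsort(probs)[::-1]
    keep = order[:top_k]                               # top-k cut
    cum = np.cumsum(probs[keep])
    keep = keep[: np.searchsorted(cum, top_p) + 1]     # top-p (nucleus) cut
    keep = keep[probs[keep] >= min_p * probs.max()]    # min-p cut
    out = np.zeros_like(probs)
    out[keep] = probs[keep] / probs[keep].sum()
    return out
```

The min-p cut prunes tokens whose probability falls below a fraction of the top token's, so on a confident step the effective vocabulary shrinks even further than top-p alone would allow.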