10 results for “topic:layernorm”
Root Mean Square Layer Normalization
Code for the paper "On the Expressivity Role of LayerNorm in Transformers' Attention" (Findings of ACL'2023)
Implementation of a layer-normalized GRU in PyTorch
Some common CUDA kernel implementations (not the fastest).
Unveiling the Layers: Neural Networks from first principles
MNIST digit prediction using Batch Normalization, Group Normalization, Layer Normalization, and L1/L2 regularization
WGAN with feedback from the discriminator & LayerNorm instead of BatchNorm
Implemented a custom LayerNorm forward and backward pass extension in C++ using LibTorch and validated numerical equivalence with PyTorch.
Fundamentals of Artificial Intelligence and Deep Learning Frameworks
High-performance CUDA implementation of LayerNorm for PyTorch achieving 1.46x speedup through kernel fusion. Optimized for large language models (4K-8K hidden dims) with vectorized memory access, warp-level primitives, and mixed precision support. Drop-in replacement for nn.LayerNorm with 25% memory reduction.
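Several of the entries above (the LibTorch C++ extension, the fused CUDA kernel, the CUDA kernel collection) reimplement the same underlying computation. For reference, a minimal pure-Python sketch of the LayerNorm forward pass over the last dimension — biased variance and epsilon inside the square root, matching `nn.LayerNorm`'s defaults; the function name and list-based interface here are illustrative, not taken from any of the listed repositories:

```python
import math

def layer_norm(x, gamma=None, beta=None, eps=1e-5):
    """Normalize a 1-D list of floats to zero mean and unit variance,
    then apply an optional elementwise affine transform (gamma, beta).

    Mirrors the math of nn.LayerNorm over a single normalized dimension:
        y = (x - mean) / sqrt(var + eps) * gamma + beta
    """
    n = len(x)
    mean = sum(x) / n
    # Biased (population) variance, as nn.LayerNorm uses.
    var = sum((v - mean) ** 2 for v in x) / n
    inv_std = 1.0 / math.sqrt(var + eps)
    y = [(v - mean) * inv_std for v in x]
    if gamma is not None:
        y = [g * v for g, v in zip(gamma, y)]
    if beta is not None:
        y = [v + b for v, b in zip(y, beta)]
    return y
```

A fused GPU kernel like the one described in the last entry computes exactly these per-row statistics, but in a single pass with warp-level reductions instead of separate mean/variance loops.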