XAttention - Block Sparse Attention with Antidiagonal Scoring
Implements an efficient block-sparse attention mechanism that scores each block of the attention map by summing values along strided antidiagonals and keeps only the highest-scoring blocks, achieving up to 13.5x attention speedup in long-context Transformer models while maintaining high accuracy across natural language, video understanding, and video generation tasks.
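A minimal sketch of the antidiagonal-scoring idea, assuming a block size B and stride S that divides B; the function names (`antidiagonal_block_scores`, `select_blocks`) and the threshold parameter `tau` are illustrative, not the repo's actual API. The real implementation avoids materializing the full attention map; this toy version scores a precomputed map just to show the block-selection logic:

```python
import torch

def antidiagonal_block_scores(attn: torch.Tensor, block_size: int = 64,
                              stride: int = 8) -> torch.Tensor:
    """Score each (block_size x block_size) block of an (L, L) attention map
    by summing entries along every stride-th antidiagonal. Each strided
    (wrapped) antidiagonal touches every row and column of the block, so the
    sum is a cheap proxy for the block's total attention mass."""
    L = attn.size(0)
    nb = L // block_size
    # Cut the map into an (nb, nb) grid of (block_size, block_size) blocks.
    blocks = attn[: nb * block_size, : nb * block_size]
    blocks = blocks.reshape(nb, block_size, nb, block_size).permute(0, 2, 1, 3)
    idx = torch.arange(block_size)
    scores = torch.zeros(nb, nb)
    for offset in range(0, block_size, stride):
        # Wrapped antidiagonal: positions where (row + col) % block_size == offset.
        # When stride divides block_size this equals summing every stride-th
        # plain antidiagonal of the block.
        cols = (offset - idx) % block_size
        scores += blocks[:, :, idx, cols].sum(dim=-1)
    return scores

def select_blocks(scores: torch.Tensor, tau: float = 0.9) -> torch.Tensor:
    """Per query-block row, keep the smallest set of key blocks whose
    cumulative normalized score reaches the threshold tau."""
    norm = scores / scores.sum(dim=-1, keepdim=True)
    vals, order = norm.sort(dim=-1, descending=True)
    cum = vals.cumsum(dim=-1)
    keep_sorted = (cum - vals) < tau     # include blocks until tau is crossed
    mask = torch.zeros_like(keep_sorted)
    mask.scatter_(-1, order, keep_sorted)
    return mask                          # (nb, nb) bool: key blocks to keep

# Toy usage: a random attention map over a 512-token sequence.
attn = torch.softmax(torch.randn(512, 512), dim=-1)
mask = select_blocks(antidiagonal_block_scores(attn, block_size=64, stride=8))
print(mask.float().mean())  # fraction of blocks retained
```

The antidiagonal pattern matters because each antidiagonal intersects every row and every column of its block, so a strided sample still reflects the whole block's attention mass at a fraction of the cost.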