- Updated: March 10, 2026
- 1 min read
MrRoPE: Mixed‑radix Rotary Position Embedding – A Deep Dive
Rotary Position Embedding (RoPE) extensions have become essential for handling longer sequences in transformer models. In the recent paper MrRoPE: Mixed‑radix Rotary Position Embedding (arXiv:2601.22181v1), the authors propose a unified theoretical framework that treats RoPE extension as a radix‑system conversion problem, unifying existing extension strategies and introducing two training‑free methods: MrRoPE‑Uni and MrRoPE‑Pro.
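For context, standard RoPE encodes a token's position by rotating consecutive pairs of query/key features, with per‑pair frequencies that decay geometrically across the dimension; extension methods differ in how the raw position index feeds into those angles. Below is a minimal NumPy sketch of the vanilla rotation for orientation only, not code from the paper:

```python
import numpy as np

def rope_rotate(x: np.ndarray, position: int, base: float = 10000.0) -> np.ndarray:
    """Standard RoPE: rotate consecutive (even, odd) feature pairs of a
    query/key vector by angles position * theta_i, theta_i = base^(-2i/d)."""
    d = x.shape[-1]
    theta = base ** (-np.arange(0, d, 2) / d)  # one frequency per pair
    cos, sin = np.cos(position * theta), np.sin(position * theta)
    out = np.empty_like(x)
    out[0::2] = x[0::2] * cos - x[1::2] * sin
    out[1::2] = x[0::2] * sin + x[1::2] * cos
    return out

q = np.random.randn(64)
print(rope_rotate(q, position=42)[:4])
```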
Key contributions include:
- A generalized encoding formulation that treats RoPE extensions as radix conversion problems (see the sketch after this list).
- Two practical, training‑free extensions that achieve “train short, test long” generalization.
- State‑of‑the‑art performance on long‑context benchmarks (e.g., 85% recall on 128K‑context Needle‑in‑a‑Haystack and more than twice YaRN's accuracy on InfiniteBench).
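The paper's exact formulation is best read from the source, but the radix‑conversion intuition behind the list above can be sketched: read a position index as the digits of a positional numeral system, roughly one digit per RoPE frequency band, so that extending the context amounts to converting the index into a wider (mixed) radix rather than retraining. The helper name and example radices below are illustrative assumptions, not the paper's notation:

```python
def mixed_radix_digits(position: int, radices: list[int]) -> list[int]:
    """Decompose an integer into mixed-radix digits, most significant first.
    Illustrative only, e.g. radices [4, 8]: 13 = 1 * 8 + 5 -> [1, 5]."""
    digits = []
    for r in reversed(radices):
        digits.append(position % r)   # digit for the current band
        position //= r                # carry the remainder to coarser bands
    return list(reversed(digits))

# Widening the radices stretches the reach of the same digit budget:
# two digits cover 32 positions with [4, 8] but 128 with [8, 16].
for pos in (5, 13, 31):
    print(pos, mixed_radix_digits(pos, [4, 8]), mixed_radix_digits(pos, [8, 16]))
```

Under this reading, a “train short, test long” method keeps the digit values a short‑context model has already seen while re‑basing them onto longer positions, which is one way to see why MrRoPE‑Uni and MrRoPE‑Pro can be training‑free.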
Read the full paper on arXiv (2601.22181v1). For more insights on advanced position embeddings, explore our related articles at ubos.tech/blog.

Stay tuned to ubos.tech for upcoming deep‑dives into cutting‑edge AI research.