AI
March 12, 2026 · 1 min read

Google DeepMind Unveils Titans + MIRAS: Revolutionizing AI with True Long-Term Memory

TripleG News


Google Research announced Titans and MIRAS on March 12, 2026, introducing a novel architecture and a theoretical framework that grant AI models genuine long-term memory. Titans features a neural long-term memory module built as a multi-layer perceptron, far more expressive than the fixed-size state vectors of traditional RNNs. The module actively learns relationships and themes across inputs, using a 'surprise' metric (inspired by human psychology, it flags unexpected or novel information) to store only what is truly significant, letting the model synthesize entire narratives rather than merely take notes.
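The full update rule is in Google's paper rather than this summary, but the surprise-gated idea can be illustrated with a toy sketch: treat "surprise" as the gradient magnitude of the memory's prediction error, and write to memory only when it exceeds a threshold. Everything below (the linear memory standing in for Titans' MLP, the learning rate, the threshold) is a hypothetical simplification, not Google's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 8
W = np.zeros((d, d))  # toy linear memory standing in for Titans' MLP

def surprise_update(W, key, value, lr=0.1, threshold=0.5):
    """One surprise-gated write: the memory tries to map key -> value,
    and 'surprise' is the gradient magnitude of the squared error."""
    err = W @ key - value            # prediction error on this input
    grad = np.outer(err, key)        # d/dW of 0.5 * ||W @ key - value||^2
    surprise = np.linalg.norm(grad)  # large when the input is unexpected
    if surprise > threshold:         # store only significant events
        W = W - lr * grad
    return W, surprise

key = rng.normal(size=d)
key /= np.linalg.norm(key)           # unit key keeps the toy update stable
value = rng.normal(size=d)
W, s = surprise_update(W, key, value)
```

Routine inputs (small gradients) leave the memory untouched, while novel ones trigger a write; that gating is the mechanism behind storing "only what's truly significant."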

MIRAS complements Titans by providing a unified framework for sequence modeling, breaking it down into four key design choices: memory architecture, attentional bias, retention gate, and memory algorithm. This enables innovative, attention-free models such as YAAD, MONETA, and MEMORA, which use techniques like the Huber loss for robustness against outliers. In benchmarks, Titans outperformed leading architectures such as Transformers and Mamba2, and even much larger models such as GPT-4 and Llama3.1-70B, on long-context tasks, including the 'Needle in a Haystack' test (over 95% accuracy) and BABILong, while handling contexts exceeding 2 million tokens with just 760 million parameters.
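The Huber loss mentioned above is a standard robust objective, not MIRAS-specific code: it is quadratic for small residuals and linear for large ones, so a single outlier token cannot dominate the memory objective the way it would under squared loss. A minimal sketch of the standard definition:

```python
import numpy as np

def huber(residual, delta=1.0):
    """Huber loss: quadratic within |r| <= delta, linear beyond it."""
    a = np.abs(residual)
    return np.where(a <= delta, 0.5 * a**2, delta * (a - 0.5 * delta))

# A large residual is penalized far less than under squared loss,
# which is what makes a memory objective robust to outliers.
small = huber(np.array(0.5))   # 0.125, same as the squared loss here
large = huber(np.array(10.0))  # 9.5, versus 50.0 for squared loss
```

The `delta` parameter sets where the penalty switches from quadratic to linear; tuning it trades sensitivity to small errors against tolerance of outliers.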

This breakthrough matters because it addresses a core AI limitation: forgetting context over long sequences, which hampers applications like contract analysis, customer interactions, and technical research. By enabling continuous learning during inference without retraining or quadratic computational costs, Titans + MIRAS boosts scalability and efficiency, blending RNN speed with Transformer expressiveness.

Looking ahead, these advancements open doors to a new era of long-context AI, potentially integrating into production models for real-world tasks like genome modeling and time series forecasting. Google envisions broader adoption, sparking further innovations in memory-optimized architectures for startups and enterprises.
