TWIML AI Podcast

Recurrence and Attention for Long-Context Transformers with Jacob Buckman - #750

Oct 07, 2025 · 57m

Today, we're joined by Jacob Buckman, co-founder and CEO of Manifest AI, to discuss achieving long context in transformers. We cover the bottlenecks of scaling context length and recent techniques for overcoming them, including windowed attention, grouped query attention, and latent space attention. We explore the idea of weight-state balance and the weight-state FLOP ratio as a way of reasoning about the optimality of compute architectures, and we dig into the Power Retention architecture, which blends the parallelization of attention …
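The episode summary doesn't define the weight-state FLOP ratio, but a rough back-of-envelope sketch can convey the intuition. The sketch below assumes the ratio compares FLOPs spent multiplying against learned weight matrices versus FLOPs spent interacting with context-dependent state (the KV cache); the function name, formulas, and dimensions are illustrative assumptions, not Manifest AI's exact definitions.

```python
# Hypothetical sketch: per-token FLOPs on weights vs. on state for one
# standard transformer layer. Formulas are illustrative assumptions.

def weight_state_flop_ratio(d_model: int, d_ff: int, context_len: int) -> float:
    """Estimate the weight-FLOP / state-FLOP ratio per token per layer."""
    # Weight FLOPs: multiplies against learned matrices
    # (Q, K, V, and output projections plus the two MLP matrices).
    weight_flops = 2 * (4 * d_model * d_model + 2 * d_model * d_ff)

    # State FLOPs: attention-score computation and value aggregation
    # over the KV cache, which grow linearly with context length.
    state_flops = 2 * (2 * context_len * d_model)

    return weight_flops / state_flops

# At short context, weight FLOPs dominate; at very long context,
# state FLOPs dominate -- the imbalance that long-context designs
# such as those discussed in the episode aim to address.
for ctx in (1_024, 32_768, 1_048_576):
    print(f"context={ctx:>9}: ratio={weight_state_flop_ratio(4096, 16384, ctx):.2f}")
```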
