The a16z Show

What's Missing Between LLMs and AGI - Vishal Misra & Martin Casado

Mar 17, 2026 · 47m

Vishal Misra returns to explain his latest research on how LLMs actually work under the hood. He walks through experiments showing that transformers update their predictions in a precise, mathematically predictable way as they process new information, explains why this still doesn't mean they're conscious, and describes what's actually required for AGI: the ability to keep learning after training and the move from pattern matching to understanding cause and effect.

Resources:

Follow Vishal Misra on X: https://x.com/vishalmisra …
