#326 Zuzanna Stamirowska: Inside Pathway's Post-Transformer Architecture Designed for Memory and On-the-Fly Learning - Eye On A.I. Recap
Podcast: Eye On A.I.
Published: 2026-03-11
Duration: 1 hr 8 min
Guests: Zuzanna Stamirowska
Summary
Zuzanna Stamirowska discusses Pathway's new architecture that integrates memory and reasoning into AI, challenging the limitations of traditional transformer models.
What Happened
Zuzanna Stamirowska, CEO of Pathway, discusses the limitations of traditional transformer models and introduces Pathway's post-transformer architecture designed for memory and on-the-fly learning. She likens today's AI models to an intern who never learns anything beyond their first day: because the models are static once trained, they cannot improve over time.
Stamirowska explains the architecture of Pathway's model, which is based on graph-like structures rather than traditional layers. This model allows for local interactions between neurons, mimicking the brain's synapses and enabling more efficient learning and reasoning. The system's design allows it to retain information through strengthened synaptic connections, offering a novel approach to solving the problem of catastrophic forgetting.
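The mechanism described here, co-active neurons strengthening the synapse between them on a sparse graph rather than in dense layers, is essentially a Hebbian-style local update. The sketch below is purely illustrative, assuming a small random graph and a standard Hebbian rule with decay; all names, sizes, and constants are invented for this example and are not Pathway's BDH implementation.

```python
# Toy Hebbian-style update on a sparse neuron graph, in the spirit of the
# "strengthened synaptic connections" described above. Illustrative only;
# not Pathway's BDH code.
import numpy as np

rng = np.random.default_rng(0)

n_neurons = 8
# Sparse, graph-like connectivity instead of dense layers: each neuron
# interacts only with a few neighbors.
adjacency = rng.random((n_neurons, n_neurons)) < 0.3
np.fill_diagonal(adjacency, False)
weights = np.where(adjacency, 0.1, 0.0)  # initial synaptic strengths

def step(activity, weights, lr=0.05, decay=0.01):
    """One local update: propagate activity along the graph, then
    strengthen synapses between co-active neurons (Hebbian rule)
    while slowly decaying unused ones."""
    new_activity = np.tanh(weights @ activity)
    # Hebbian term: outer product of post- and pre-synaptic activity,
    # masked to existing edges so all interactions stay local.
    hebb = lr * np.outer(new_activity, activity) * adjacency
    new_weights = (1 - decay) * weights + hebb
    return new_activity, new_weights

activity = rng.random(n_neurons)
for _ in range(10):
    activity, weights = step(activity, weights)

# Edges between co-active neurons are now strengthened; the rest decayed.
```

Because the update touches only existing edges and uses only the two endpoint activities, information persists in the weights themselves rather than in a separate training phase, which is the intuition behind avoiding catastrophic forgetting.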
The episode delves into the potential of the Baby Dragon Hatchling (BDH) architecture to support reasoning and creativity by creating a more dynamic and adaptable AI system. Stamirowska believes this can address the challenge of AI hallucinations by maintaining consistent reasoning over extended periods.
Pathway's approach draws on principles from complexity science and neuroscience, aiming for a system in which function dynamically shapes network structure, much as it does in organic systems in nature. This architecture supports the idea of emergent properties, which could lead to breakthroughs in AI reasoning and innovation.
The potential applications of this architecture are significant, particularly in areas requiring complex reasoning with limited data. Stamirowska highlights its suitability for tasks needing continual learning and adaptability to changing environments, such as healthcare claims resolution.
Partnerships with companies like Nvidia and AWS are facilitating the development and potential commercialization of this architecture. The first models are expected to be available to AWS customers, promising enhanced reasoning capabilities and efficiency in AI tasks.
Key Insights
- Pathway's post-transformer architecture mimics the brain's synaptic connections, allowing for local interactions between neurons and more efficient learning. This approach addresses the issue of catastrophic forgetting by retaining information through strengthened synaptic connections.
- The Baby Dragon Hatchling (BDH) architecture supports reasoning and creativity by maintaining consistent reasoning over extended periods, potentially reducing AI hallucinations. This dynamic adaptability is achieved by allowing the function to shape the network, similar to organic systems in nature.
- Pathway's architecture is particularly suitable for tasks requiring continual learning and adaptability to changing environments, such as healthcare claims resolution. Its design supports complex reasoning with limited data, making it a valuable tool for industries facing dynamic challenges.
- Partnerships with Nvidia and AWS are aiding the development and commercialization of Pathway's architecture. The first models are expected to be available to AWS customers, promising enhanced reasoning capabilities and efficiency in AI tasks.