From purely symbolic systems and perceptrons in the 1970s to today's deep learning and LLM architectures, AI has made tremendous progress over the last 50 years. Yet we believe that scaling alone is not enough: new ideas are now needed to reach the next generation of AI, one that synthesizes the advances of the last decades into new approaches that are more robust, grounded in interaction, and capable of real understanding.
Today's forefront AI research focuses on key limitations of current deep learning and LLM approaches: understanding causality, grounding shared meaning, reasoning and abstraction formation, embodiment, interaction, and life-long learning.
At Developmental Labs, we advocate hybrid approaches: combining the best of neural network technologies with cognitive architectures that make use of explicit representations, optimization techniques with hypothesis generation, and algorithmic methods with formal approaches.
We also focus on a developmental approach to AI, in which AI systems are embodied in a physical and social world and incrementally build knowledge through interaction with humans. Language emergence is a cornerstone of our approach: it is central to the process by which grounded meaning is socially synchronized between agents, out-of-distribution goals can be explored, and joint attention and joint plans can be established between agents.
Combine the best of deep learning and algorithmic approaches into a hybrid cognitive architecture.
Train AIs in VR simulations, interacting with humans and building grounded common sense and language.