Continuous Thought Machines
Citation
Authors: Luke N. Darlow et al.
Year: 2025
Abstract
Modern neural networks abstract away temporal neural dynamics. This paper introduces the Continuous Thought Machine (CTM), an architecture built on neuron-level temporal processing and on neural synchronization as its latent representation.
Summary
A novel architecture that leverages neural timing and synchronization as computational principles, with emergent properties such as adaptive computation time.
Key Contributions
- Internal tick dimension decoupled from data dimensions
- Neuron-level models (NLMs) with private weights
- Neural synchronization as latent representation
- Emergent adaptive computation
Core Concepts & Definitions
Continuous Thought Machine (CTM)
An architecture with:
- Internal tick dimension: an internal time axis the model iterates over, decoupled from the data dimensions
- Neuron-level models (NLMs): per-neuron private weights that process each neuron's incoming activation history (sketched below)
- Neural synchronization: the correlation structure of neuron activations over internal ticks, used as the latent representation
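A minimal sketch of the tick loop and neuron-level models, assuming simplified shapes and hypothetical names (`NeuronLevelModels`, `history_len`, the `synapse` stand-in); the actual CTM adds attention, a synapse model, and synchronization-based outputs.

```python
import torch
import torch.nn as nn

class NeuronLevelModels(nn.Module):
    """Per-neuron private MLPs applied to each neuron's activation history.
    Simplification for illustration: all NLMs batched as grouped linear maps."""
    def __init__(self, n_neurons: int, history_len: int, hidden: int = 16):
        super().__init__()
        # Private weights: one (history_len -> hidden -> 1) MLP per neuron.
        self.w1 = nn.Parameter(torch.randn(n_neurons, history_len, hidden) * 0.1)
        self.b1 = nn.Parameter(torch.zeros(n_neurons, hidden))
        self.w2 = nn.Parameter(torch.randn(n_neurons, hidden, 1) * 0.1)

    def forward(self, pre_history: torch.Tensor) -> torch.Tensor:
        # pre_history: (batch, n_neurons, history_len) -> post: (batch, n_neurons)
        h = torch.relu(torch.einsum('bnk,nkh->bnh', pre_history, self.w1) + self.b1)
        return torch.einsum('bnh,nho->bno', h, self.w2).squeeze(-1)

# Internal tick loop: iterations are decoupled from any data dimension.
n, hist, ticks = 64, 8, 20
nlms = NeuronLevelModels(n, hist)
synapse = nn.Linear(n, n)                      # stand-in for the synapse model
pre = torch.zeros(2, n, hist)                  # rolling pre-activation history
for t in range(ticks):
    post = nlms(pre)                           # neuron-level temporal processing
    new_pre = synapse(post)                    # recurrent mixing across neurons
    pre = torch.cat([pre[:, :, 1:], new_pre.unsqueeze(-1)], dim=-1)
```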
Synchronization Matrix
Computed from the history of post-activations across internal ticks (pairwise inner products of neuron activation traces), and used as the latent representation from which outputs are produced.
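A sketch of this computation, assuming the formulation S = Z Zᵀ where Z stacks post-activation traces over ticks; subsampling neuron pairs into a compact vector (`latent_from_sync`) is a hypothetical helper based on the paper's description.

```python
import torch

def synchronization(post_history: torch.Tensor) -> torch.Tensor:
    """post_history: (batch, n_neurons, n_ticks) of post-activations.
    Returns (batch, n_neurons, n_neurons): inner products of activation
    traces, i.e. how synchronized each neuron pair has been over ticks."""
    return torch.einsum('bnt,bmt->bnm', post_history, post_history)

def latent_from_sync(S: torch.Tensor, n_pairs: int = 128) -> torch.Tensor:
    # Subsample neuron pairs to form a compact latent vector
    # (hypothetical; pair selection would be fixed at init in practice).
    b, n, _ = S.shape
    idx = torch.randint(0, n, (2, n_pairs))
    return S[:, idx[0], idx[1]]                # (batch, n_pairs)
```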
Main Results
- Solves 2D mazes via internal map formation
- Learns to “look around” images before classifying
- Native adaptive computation time (see the sketch after this list)
- Generalizes to longer sequences in parity computation
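Per-tick predictions make adaptive computation straightforward: stop ticking once prediction certainty (e.g. one minus normalized entropy) crosses a threshold. A minimal sketch with assumed names (`step_fn`, `threshold`); the paper's training objective that encourages this behavior is more involved.

```python
import torch

def entropy_certainty(logits: torch.Tensor) -> torch.Tensor:
    # Certainty as 1 - normalized entropy of the per-tick class distribution.
    p = torch.softmax(logits, dim=-1)
    ent = -(p * p.clamp_min(1e-9).log()).sum(-1)
    return 1.0 - ent / torch.log(torch.tensor(float(logits.shape[-1])))

def run_with_halting(step_fn, state, max_ticks: int = 50, threshold: float = 0.9):
    """step_fn advances the internal state one tick and returns (state, logits).
    Hypothetical interface: stop early once every batch item is certain."""
    for t in range(max_ticks):
        state, logits = step_fn(state)
        if entropy_certainty(logits).min() > threshold:
            break
    return logits, t + 1   # prediction and number of ticks used
```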
Relevance to Project
Low; tangential to the project's main focus:
- Different architectural approach, not directly about skills
- Temporal dynamics could be relevant for skill execution modeling
- Synchronization concept distantly related to skill coordination
Related Papers
- (Architecture-focused, less connected to main skill literature)