Continuous Thought Machines

Citation

Authors: Luke N. Darlow et al.
Year: 2025
Venue:
URL:

Abstract

Modern neural networks abstract away the temporal dynamics of biological neurons. This paper introduces the Continuous Thought Machine (CTM), an architecture that incorporates neuron-level temporal processing and uses neural synchronization as its latent representation.

Summary

A novel architecture that treats neural timing and synchronization as core computational principles, yielding emergent properties such as adaptive computation time.

Key Contributions

  1. An internal "tick" dimension decoupled from the data dimensions (see the sketch after this list)
  2. Neuron-level models (NLMs) with private, per-neuron weights
  3. Neural synchronization used as the latent representation
  4. Emergent adaptive computation
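A minimal sketch of contribution 1, assuming a PyTorch-style module: the model unrolls a fixed number of internal ticks regardless of input size, keeping a rolling pre-activation history per neuron. The class name `CTMTickLoop`, the simple linear "synapse", and all shapes are illustrative assumptions, not the paper's reference implementation.

```python
import torch
import torch.nn as nn

class CTMTickLoop(nn.Module):
    """Sketch: unroll a fixed number of internal ticks, decoupled from data shape."""

    def __init__(self, n_features: int, n_neurons: int, history_len: int, n_ticks: int):
        super().__init__()
        self.n_ticks, self.history_len = n_ticks, history_len
        # Stand-in "synapse" mixing input features with previous post-activations
        # (the paper uses a richer synapse model plus attention over the input).
        self.synapse = nn.Linear(n_features + n_neurons, n_neurons)
        # Stand-in per-neuron readout of each neuron's history; a fuller
        # neuron-level-model sketch appears under Core Concepts below.
        self.nlm_w = nn.Parameter(torch.randn(n_neurons, history_len) * 0.1)

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        b, d = features.shape[0], self.nlm_w.shape[0]
        history = torch.zeros(b, d, self.history_len)  # rolling pre-activation buffer
        z = torch.zeros(b, d)                          # current post-activations
        post = []
        for _ in range(self.n_ticks):  # tick count is a model choice, not data-driven
            pre = self.synapse(torch.cat([features, z], dim=-1))             # (b, d)
            history = torch.cat([history[:, :, 1:], pre.unsqueeze(-1)], dim=-1)
            z = torch.tanh((history * self.nlm_w).sum(dim=-1))  # per-neuron weights
            post.append(z)
        return torch.stack(post, dim=1)  # (b, n_ticks, n_neurons)

# Example: 25 ticks of internal computation for a 32-feature input.
out = CTMTickLoop(n_features=32, n_neurons=64, history_len=10, n_ticks=25)(torch.randn(4, 32))
```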

Core Concepts & Definitions

Continuous Thought Machine (CTM)

An architecture combining:

  1. An internal tick dimension along which computation unfolds
  2. Neuron-level models (NLMs): private per-neuron weights applied to each neuron's activation history (sketched below)
  3. Neural synchronization: the correlation structure of post-activations across ticks, used as the latent representation
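To make item 2 concrete, here is a hedged sketch of NLMs with private weights, assuming each neuron applies its own small MLP to its own history; the tensor shapes, hidden width, and batched-einsum layout are implementation assumptions, not the authors' exact code.

```python
import torch
import torch.nn as nn

class NeuronLevelModels(nn.Module):
    """Each neuron applies its own private MLP to its own activation history."""

    def __init__(self, n_neurons: int, history_len: int, hidden: int = 16):
        super().__init__()
        # Private weights: a separate (history_len -> hidden -> 1) MLP per neuron,
        # stored as batched parameter tensors rather than one shared layer.
        self.w1 = nn.Parameter(torch.randn(n_neurons, history_len, hidden) * 0.1)
        self.b1 = nn.Parameter(torch.zeros(n_neurons, hidden))
        self.w2 = nn.Parameter(torch.randn(n_neurons, hidden) * 0.1)
        self.b2 = nn.Parameter(torch.zeros(n_neurons))

    def forward(self, history: torch.Tensor) -> torch.Tensor:
        # history: (batch, n_neurons, history_len) pre-activation traces.
        h = torch.relu(torch.einsum("bnm,nmh->bnh", history, self.w1) + self.b1)
        # Collapse each neuron's hidden state to one post-activation per tick.
        return torch.einsum("bnh,nh->bn", h, self.w2) + self.b2
```

Because every parameter tensor carries a neuron index, no two neurons share weights, which is what lets each neuron develop its own temporal filter over its history.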

Synchronization Matrix

A matrix of pairwise synchronization scores computed from the post-activation history over internal ticks; its entries serve as the latent representation.
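A hedged sketch of that computation, assuming post-activations are stacked into a matrix of shape (n_neurons, n_ticks); the paper additionally applies learnable per-pair temporal decay and subsamples neuron pairs, both omitted here for brevity.

```python
import torch

def synchronization(z_history: torch.Tensor) -> torch.Tensor:
    # z_history: (n_neurons, n_ticks) post-activations accumulated so far.
    # Entry S[i, j] is the inner product of neuron i's and neuron j's traces,
    # an (unnormalized) co-activation score between the two neurons.
    return z_history @ z_history.T  # (n_neurons, n_neurons)

def sync_latent(z_history: torch.Tensor, idx_i: torch.Tensor, idx_j: torch.Tensor) -> torch.Tensor:
    # Only a subset of (i, j) pairs is flattened into the latent vector that
    # drives attention and output heads (the pair selection here is an assumption).
    return synchronization(z_history)[idx_i, idx_j]
```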

Main Results

  1. Solves 2D mazes by forming an internal map of the environment
  2. Learns to “look around” images before classifying them
  3. Exhibits native adaptive computation time (illustrated in the sketch after this list)
  4. Generalizes to longer sequences on the parity task
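Result 3 can be illustrated with a hedged sketch: read out a prediction at every tick, score it with certainty defined as one minus normalized entropy, and stop once certainty crosses a threshold. The hard threshold is an assumption for illustration; the paper trains with a loss over per-tick predictions and certainties rather than an explicit halting rule.

```python
import torch

def certainty(logits: torch.Tensor) -> torch.Tensor:
    # 1 - normalized entropy of the per-tick class distribution (1.0 = certain).
    p = torch.softmax(logits, dim=-1)
    entropy = -(p * torch.log(p.clamp_min(1e-9))).sum(dim=-1)
    return 1.0 - entropy / torch.log(torch.tensor(float(logits.shape[-1])))

def predict_with_early_exit(per_tick_logits: torch.Tensor, threshold: float = 0.9):
    # per_tick_logits: (n_ticks, n_classes), one readout per internal tick.
    for t, logits in enumerate(per_tick_logits):
        if certainty(logits) >= threshold:          # confident enough: halt early
            return logits.argmax().item(), t + 1    # (prediction, ticks used)
    return per_tick_logits[-1].argmax().item(), len(per_tick_logits)
```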

Relevance to Project

Low; tangential to the project's main focus:

  • A different architectural approach, not directly about skills
  • Temporal dynamics could be relevant for modeling skill execution
  • The synchronization concept is distantly related to skill coordination
  • Architecture-focused, with weak ties to the core skill literature