Contemporary discussions of the nature of consciousness frequently polarize around two dominant perspectives. One, computational functionalism, posits that thought can be entirely characterized as abstract information processing: any system possessing the requisite functional organization, irrespective of its material composition, should inherently give rise to consciousness. The other, biological naturalism, asserts the opposite, contending that consciousness is inextricably linked to the unique characteristics of living brains and bodies; on this view, biology is not merely a vessel for cognition but an integral component of the cognitive process itself. While both viewpoints offer valuable insights, their persistent disagreement suggests that a crucial element is missing from the prevailing discourse.
To address this impasse, the authors introduce a framework they provocatively label "biological computationalism." The approach seeks to sharpen the debate by questioning whether the standard computational paradigm is adequate for describing how biological brains actually work. For decades, the prevailing inclination has been to conceptualize the mind as software running on neural hardware, with the brain performing computations akin to those of a conventional digital computer. The analogy falters under scrutiny: real brains do not adhere to the architectural principles of von Neumann machines, and forcing the comparison produces imprecise metaphors and, ultimately, explanations that lack robustness. To understand how brains compute, and what would be necessary to replicate minds in alternative physical substrates, a more expansive definition of "computation" is required.
Biological computation, as delineated by this new framework, is characterized by three fundamental attributes.
The first defining characteristic is its hybrid nature, seamlessly integrating discrete events with continuous dynamical processes. Neuronal firing, the release of neurotransmitters at synapses, and the shifts in network states all represent event-like occurrences. Simultaneously, these events are embedded within and influenced by perpetually evolving physical conditions, including fluctuating voltage fields, chemical gradients, the diffusion of ions, and time-varying electrical conductances. The brain, therefore, cannot be accurately classified as purely digital, nor can it be solely described as an analog machine. Instead, it operates as a complex, multi-layered system where continuous physical processes exert influence over discrete events, and conversely, discrete events continuously reconfigure the underlying physical landscape in a perpetual feedback loop.
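To make this hybrid picture concrete, the sketch below uses a standard leaky integrate-and-fire toy model, not anything specified by the framework itself: a continuously evolving membrane potential is punctuated by discrete spike events that reset it. All parameter names and values are illustrative choices.

```python
# Minimal leaky integrate-and-fire neuron: a continuous "field" (the membrane
# potential) evolves smoothly, while discrete events (spikes) punctuate and
# reset that continuous state. Parameters are illustrative, not from the text.

dt = 0.1          # integration step (ms)
tau_m = 10.0      # membrane time constant (ms)
v_rest = -65.0    # resting potential (mV)
v_thresh = -50.0  # spike threshold (mV)
v_reset = -70.0   # post-spike reset potential (mV)
i_ext = 2.0       # constant driving current (arbitrary units)

v = v_rest
spike_times = []
for step in range(5000):                       # simulate 500 ms
    # Continuous part: the potential relaxes toward its driven steady state.
    v += ((-(v - v_rest) + i_ext * tau_m) / tau_m) * dt
    # Discrete part: crossing threshold is an event that resets the field.
    if v >= v_thresh:
        spike_times.append(step * dt)
        v = v_reset

print(f"{len(spike_times)} spikes in 500 ms")
```

Even in this caricature, the two regimes are mutually defining: the spike times depend on the continuous trajectory, and each spike resets the trajectory that produces the next one.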
The second pivotal feature of biological computation is its scale-inseparability. In traditional computing, a clear demarcation often exists between software and hardware, or between a "functional level" and an "implementation level." The brain, however, defies such neat partitioning. There is no distinct boundary that allows for the isolation of an algorithm from its physical instantiation. Causal relationships propagate across multiple scales simultaneously, encompassing phenomena from individual ion channels and dendritic structures to intricate neural circuits and the dynamics of the entire brain. These levels do not function as independent modules arranged in discrete layers. Within biological systems, modifications to the "implementation" invariably alter the "computation," underscoring the profound interdependence of the two.
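A toy way to see this inseparability, reusing the same illustrative integrate-and-fire model as above: changing a nominally "implementation-level" quantity such as the membrane time constant (effectively the leak conductance) does not merely rescale the "algorithm-level" output; it can determine whether there is any output at all. The function and parameter values below are invented for illustration.

```python
def lif_spike_count(tau_m, i_ext=2.0, t_max=500.0, dt=0.1,
                    v_rest=-65.0, v_thresh=-50.0, v_reset=-70.0):
    """Spike count of a leaky integrate-and-fire neuron for a given
    membrane time constant; all values are illustrative."""
    v, spikes = v_rest, 0
    for _ in range(int(t_max / dt)):
        v += ((-(v - v_rest) + i_ext * tau_m) / tau_m) * dt
        if v >= v_thresh:
            spikes += 1
            v = v_reset
    return spikes

# An "implementation" change (the leak time constant) alters the "computation"
# (the firing output), rather than leaving some abstract algorithm untouched.
for tau_m in (5.0, 10.0, 20.0):
    print(f"tau_m = {tau_m:4.1f} ms -> {lif_spike_count(tau_m)} spikes")
```

With the smallest time constant the steady-state potential never reaches threshold and the neuron stays silent; with larger ones it fires at increasing rates, so the "hardware" parameter is doing algorithmic work.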
Thirdly, biological computation is metabolically grounded. The brain operates under stringent energetic constraints, and these limitations profoundly shape its structure and functional repertoire across all levels. This is not merely a secondary engineering consideration; rather, these energy constraints actively influence the kinds of information the brain can represent, its capacity for learning, the stability of particular patterns, and the mechanisms by which information is coordinated and routed. From this perspective, the pervasive and tight coupling observed across different scales is not an artifact of accidental complexity but represents an evolutionary strategy for energy optimization, enabling robust and adaptable intelligence within severe metabolic limitations.
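One way to see how an energy budget shapes representation, again as a toy illustration only: in the sparse-coding sketch below, an L1 penalty on activity stands in for metabolic cost, and raising its weight silences units while degrading reconstruction. The dictionary, stimulus, and penalty weights are all invented for this example.

```python
import numpy as np

# Toy sparse-coding sketch: minimize (1/2)||x - D a||^2 + w * ||a||_1 by
# iterative soft-thresholding (ISTA). The L1 term stands in for metabolic
# cost: activity is "expensive," so the energy weight w shapes the code.
# Dictionary, stimulus, and weights are invented for illustration.

rng = np.random.default_rng(0)
D = rng.normal(size=(8, 16))
D /= np.linalg.norm(D, axis=0)                     # 16 unit-norm "neurons"
x = D[:, [1, 5, 9]] @ np.array([1.0, 0.8, 1.2])    # stimulus built from 3 of them

def sparse_code(x, D, energy_weight, steps=500):
    a = np.zeros(D.shape[1])
    step = 1.0 / np.linalg.norm(D, 2) ** 2          # safe gradient step size
    for _ in range(steps):
        grad = D.T @ (D @ a - x)                    # reconstruction-error gradient
        a = a - step * grad
        # Soft-threshold: the energy penalty pushes small activities to zero.
        a = np.sign(a) * np.maximum(np.abs(a) - step * energy_weight, 0.0)
    return a

for w in (0.01, 0.5, 2.0):
    a = sparse_code(x, D, w)
    active = int(np.sum(np.abs(a) > 1e-3))
    print(f"energy weight {w:4.2f}: {active:2d} active units, "
          f"reconstruction error {np.linalg.norm(x - D @ a):.3f}")
```

The point is not this particular algorithm but the trade-off it makes explicit: under a tighter energy budget the system tends to represent the same stimulus with fewer active units, at some cost in fidelity.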
Collectively, these three characteristics lead to a conclusion that may appear counterintuitive to those accustomed to classical computing paradigms. Computation within the brain is not reducible to the abstract manipulation of symbols, nor is it solely about the movement of representations governed by formal rules while the physical medium is relegated to a mere "implementation." In the context of biological computation, the algorithm and the substrate are inextricably linked. The physical organization does not simply facilitate computation; it constitutes the very essence of what computation is. Brains do not execute programs in the conventional sense; rather, they embody specific physical processes that compute by unfolding over time.
This re-conceptualization of computation also illuminates a significant limitation in the common descriptions of contemporary artificial intelligence. Even highly sophisticated AI systems primarily engage in function simulation. They excel at learning mappings between inputs and outputs, often exhibiting remarkable generalization capabilities. However, the underlying computation remains a digital procedure executed on hardware designed for a fundamentally different mode of computation. In stark contrast, brains perform computation within the constraints of physical time. Continuous fields, ionic flows, dendritic integration, localized oscillatory coupling, and emergent electromagnetic interactions are not peripheral biological "details" that can be disregarded when abstracting an algorithm. Instead, these are the fundamental computational primitives of the system, the very mechanisms that enable real-time integration, resilience, and adaptive control.
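A small contrast may help here, as an illustration rather than anything drawn from the source: a memoryless input-to-output mapping returns the same answer regardless of when its inputs arrive, whereas a system whose state decays in continuous time, like the leaky integrator below, gives different answers for the same inputs delivered with different timing. All constants are illustrative.

```python
# Contrast between a timing-blind mapping and a state that unfolds in time.
# The same three unit pulses yield one answer from the static function, but
# different answers from the leaky integrator depending on when they arrive.

def static_mapping(pulses):
    """Timing-blind: output depends only on the input values themselves."""
    return sum(pulses)

def leaky_integrator(pulse_times, amplitude=1.0, tau=20.0, t_end=100.0, dt=0.1):
    """Timing-sensitive: the state decays continuously between discrete inputs."""
    v = 0.0
    pulse_steps = {int(round(t / dt)) for t in pulse_times}
    for step in range(int(t_end / dt)):
        if step in pulse_steps:
            v += amplitude                # discrete input event
        v += (-v / tau) * dt              # continuous decay in between
    return v

bunched = [10.0, 12.0, 14.0]              # pulses close together (ms)
spread = [10.0, 40.0, 70.0]               # same pulses, spread out in time

print("static mapping  :", static_mapping([1, 1, 1]), "vs", static_mapping([1, 1, 1]))
print("leaky integrator:", round(leaky_integrator(bunched), 3),
      "vs", round(leaky_integrator(spread), 3))
```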
This perspective does not entail that consciousness is confined to carbon-based life. The argument is not "biology or nothing." The claim is more precise and practical: if consciousness, or mind-like cognition, depends on this form of computation, then it may require a computational organization that mirrors biological processes, even if it is instantiated in novel substrates. What matters is not the literal biological nature of the substrate, but whether the system embodies the requisite hybrid, scale-inseparable, and energetically grounded computational characteristics.
This reframing profoundly alters the objectives for those endeavoring to construct synthetic minds. If brain computation cannot be disentangled from its physical realization, then the mere scaling of digital AI may prove insufficient. This limitation arises not from a deficiency in the potential capabilities of digital systems, but because raw capability represents only one facet of the challenge. A more profound concern is the potential for optimizing the wrong aspects by enhancing algorithms while leaving the fundamental computational ontology unchanged. Biological computationalism posits that the creation of truly mind-like systems may necessitate the development of new classes of physical machines whose computation is not structured as software operating on hardware, but rather as a distributed, dynamically interconnected process shaped by the realities of real-time physics and energy constraints.
Consequently, if the aspiration is synthetic consciousness, the central question may shift from "What algorithm should we implement?" to "What kind of physical system makes the computation inseparable from its own dynamics?" The crucial question becomes which features are essential for computation to be an intrinsic property of the system itself, rather than an abstract description layered on top of it: the interplay of hybrid event-field interactions, multi-scale coupling without clean interfaces, and energetic constraints that guide inference and learning.
This represents the fundamental paradigm shift advocated by biological computationalism. It redirects the challenge from identifying the correct program to discovering the appropriate kind of computing matter.
