
Brain Power


By Harikrishnan · Published 11 months ago · 3 min read

The human brain is just 2% of the body’s weight, but 20% of its metabolic load (1–3), and 10 times more expensive per gram than muscle. On the other hand, the brain manages to produce poetry, design spacecraft, and create art on an energy budget of ∼20 W, a paltry sum given that the computer on which this article is being typed requires 80 W. So where in the brain is power consumed, what is it used for, why is it so expensive relative to other costs of living, and how does it achieve its power efficiency relative to engineered silicon? Many classic papers have studied these questions. Attwell and Laughlin (4) developed detailed biophysical estimates suggesting that neural signaling and the postsynaptic effects of neurotransmitter release combined to account for 80% of the brain’s adenosine triphosphate (ATP) consumption, conclusions that are also supported by the overall physiology and anatomy of neural circuits (5, 6). Numerous studies explored the structural and functional consequences of this expenditure for limiting brain size (7) and scaling (8), efficient wiring patterns (9), analog (graded potential) vs. digital (spiking) signaling (10), distributed neural codes (11–13), the distribution of information traffic along nerve tracts and their size distribution (14–16), and computational heterogeneity and efficiency (17). Many of these ideas have been synthesized by Sterling and Laughlin (18) into a set of principles governing the design of brains. Now, in PNAS, Levy and Calvert (19) propose a functional accounting of the power budget of the mammalian brain, suggesting that communication is vastly more expensive than computation, and exploring the functional consequences for neural circuit organization.
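The ∼20 W figure follows from simple arithmetic. As a back-of-envelope sketch (assuming a typical resting metabolic rate of roughly 2,000 kcal/day, an illustrative value rather than a number from the article):

```python
# Back-of-envelope check of the ~20 W brain power figure.
# The ~2,000 kcal/day resting metabolic rate is an assumed typical
# value for illustration, not a figure from the article.

KCAL_TO_JOULES = 4184            # 1 kcal = 4,184 J
SECONDS_PER_DAY = 24 * 60 * 60   # 86,400 s

resting_metabolic_rate_kcal_per_day = 2000
body_power_watts = (resting_metabolic_rate_kcal_per_day
                    * KCAL_TO_JOULES / SECONDS_PER_DAY)

brain_fraction = 0.20            # brain's share of the metabolic load (refs. 1-3)
brain_power_watts = brain_fraction * body_power_watts

print(f"Whole-body power: {body_power_watts:.0f} W")   # ~97 W
print(f"Brain power:      {brain_power_watts:.0f} W")  # ~19 W, i.e. ~20 W
```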

Levy and Calvert (19) build on the earlier literature by focusing primarily on the relative power committed to different modes of information processing in the human brain, rather than on different aspects of cellular function. They make a key distinction between communication and computation. “Communication” refers, in general, to the transport of information, perhaps encoded in some way, from one site to another without transformation of the representation, extraction of salient features, or mapping to decisions or outcomes. An example in the brain is the transport of visual information, unchanged, along the optic nerve from the eye to the central brain. “Computation,” a more subtle concept, is generally understood in terms of an input–output transformation. Levy and Calvert, building on previous work of Levy, view each neuron as performing a “microscopic estimation or prediction” of latent variables in its input and encoding this output in interpulse intervals (IPIs), that is, in the relative timing of the action potentials used in signaling by most neurons. Estimation and prediction are sufficiently general frameworks to subsume other views of neural computation (e.g., neurons as dynamical systems or logic gates), and also to encompass the role played by single neurons within networks charged with carrying out a computational function. Likewise, information coding in IPIs includes other possibilities like rate codes and pattern codes as specific cases. Thus, an example of a “computation” by a neuron in the brain could be the signaling by a simple cell in V1 of the presence of a horizontal bar in the visual input.
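To make the IPI-coding picture concrete, the toy sketch below (our illustration, not a model from ref. 19) encodes a neuron’s estimate of a latent variable in the interval between successive spikes, and shows how a rate code emerges as the special case in which a downstream reader averages many IPIs. The encoding rule T = 1/estimate is a hypothetical choice made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode_ipi(estimate_hz: float) -> float:
    """Hypothetical IPI code: the neuron's estimate (in Hz) is carried
    by the interval between two successive spikes, T = 1/estimate."""
    return 1.0 / estimate_hz

def decode_ipi(interval_s: float) -> float:
    """A downstream reader inverts the interval to recover the estimate."""
    return 1.0 / interval_s

true_latent = 40.0                              # latent variable, in spikes/s
ideal_ipi = encode_ipi(true_latent)             # 0.025 s between spikes
noisy_ipis = ideal_ipi + rng.normal(0, 0.002, size=100)  # jittered intervals

# A rate code is the special case where the reader averages many IPIs.
rate_code_estimate = 1.0 / noisy_ipis.mean()

print(f"Single-IPI decode:           {decode_ipi(noisy_ipis[0]):.1f} Hz")
print(f"Rate-code decode (100 IPIs): {rate_code_estimate:.1f} Hz")
```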

Employing this perspective, the authors conclude that the biophysical processes supporting communication consume a startling 35 times more power (ATP molecules per second in biologically relevant units, or joules per second in physical units) than the processes supporting computation (19). They reach this conclusion by summing the ATP cost of recovering ionic gradients from the excitatory currents per IPI involved in computation vs. costs such as axonal resting potentials, action potentials, and vesicle recycling involved in communication. This vast difference formalizes findings implicit in ref. 4, whose authors showed that action potentials and their postsynaptic effects dominate power consumption in the brain. In another supporting line of evidence, ref. 15 showed that mitochondrial distributions in neurons track firing rates and synaptic transmission, so that the thickness of axons may be largely determined by the need to supply synaptic terminals, whose use consumes 65% of the energy budget of the mammalian brain (20). The expensive processes described in refs. 4 and 15 constitute communication, not computation, in the framework of ref. 19. Interestingly, Levy and Calvert also estimate that 27% of the cortical power expenditure goes to costs associated with synaptogenesis, such as growth via actin polymerization, membrane synthesis and incorporation, and associated intracellular transport. This refinement of previous energy budgets suggests that more than a quarter of the energy cost of owning a brain is to facilitate ongoing learning, consistent with our qualitative impression of the purpose of this organ.
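The structure of the accounting itself is simple: total the ATP consumption rates for the processes classed as communication, total those classed as computation, and take the ratio. The sketch below shows that bookkeeping with placeholder numbers that are entirely hypothetical (chosen only so the ratio lands at the reported 35×); they are not the values from ref. 19.

```python
# Structure of the power-budget comparison in ref. 19, with placeholder
# numbers (hypothetical; chosen only to illustrate the bookkeeping).
# Units: ATP molecules per second, summed over the relevant processes.

communication_costs = {
    "axonal resting potentials": 10.0e20,
    "action potentials":          6.0e20,
    "vesicle recycling":          1.5e20,
}

computation_costs = {
    "excitatory currents per IPI": 0.5e20,
}

communication_total = sum(communication_costs.values())
computation_total = sum(computation_costs.values())

ratio = communication_total / computation_total
print(f"Communication / computation ≈ {ratio:.0f}x")
# With these placeholders: 17.5e20 / 0.5e20 = 35x, matching the reported ratio.
```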
