
Researchers estimate the energy costs of information processing in biological systems

Countless biological processes rely on communication between cells and other molecular components to sustain the behaviour, physiology, and very existence of living creatures. It is well established that these molecular parts can communicate with one another in a number of ways, including diffusion, electrical depolarization, and the exchange of mechanical waves.

The energetic cost of this communication between cells and molecular components was recently investigated by researchers at Yale University. Their work, published in Physical Review Letters, introduces a framework for estimating the minimum energy a cell must spend to send information between its components.

Benjamin B. Machta, one of the researchers who conducted the study, told Phys.org, “We have been thinking about this project for a while now in one form or another.”

“I began discussing ideas that would eventually become this project about a decade ago, when Jim Sethna was my Ph.D. advisor, but for a variety of reasons the work never really took off. When Sam and I first discussed the topic, we were trying to figure out how to calculate the energy costs that biology incurs when computing (a central focus of much of his Ph.D. work) and, perhaps more generally, when making sure its parts are coordinated and under control.”

The new work by Machta and his colleague Samuel J. Bryant was motivated by earlier studies from the late ’90s, particularly those by Simon Laughlin and his collaborators, who attempted to experimentally estimate the energy cost of information transmission in neurons.

According to Machta, “this energy expenditure ranged between 10⁴ and 10⁷ k_BT/bit depending on details,” which is “far higher than the ‘fundamental’ bound of k_BT/bit,” also known as the Landauer bound, the minimum that must be spent to erase a bit of information.

“We were curious if this was a case of biology being wasteful, so to speak, or whether some other price has to be paid; the Landauer limit, for instance, does not make reference to any physical or geometric particulars. Applying the Landauer bound is itself subtle, since it only charges for erasing information; it is feasible to compute reversibly, never erase anything, and pay essentially no computing cost, but that is not what is at issue here.”
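For scale, the Landauer bound of k_BT ln 2 works out to roughly 3 × 10⁻²¹ joules per erased bit at physiological temperature. A few lines of arithmetic (ours, not the authors’) put the measured neuronal costs alongside it:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 310.0            # roughly physiological temperature, K

# Landauer bound: minimum energy to erase one bit, k_B * T * ln(2)
landauer_J = k_B * T * math.log(2)
print(f"Landauer bound: {landauer_J:.2e} J/bit (~0.69 k_BT)")

# The neuronal costs quoted above: 10^4 to 10^7 k_BT per bit
for cost_kBT in (1e4, 1e7):
    print(f"{cost_kBT:.0e} k_BT/bit = {cost_kBT * k_B * T:.1e} J/bit")
```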

Machta and Bryant’s new research aimed to understand why different molecular systems employ different physical strategies for communicating with one another, and whether optimising energetic costs could shed light on this question. While neurons normally send and receive electrical signals, other types of cells communicate through the exchange of diffusing molecules.

“We wanted to understand in what regime each of these (and others) would be best,” in terms of the lowest energy cost per bit, Machta explained. “In all our calculations, we consider information that is sent through a physical channel, from a physical sender of information (like a ‘sending’ ion channel that opens and closes to send a signal) to a physical receiver of information (a voltage detector in the membrane, which could also be an ion channel).” At its core, this calculation is a new take on a standard formula for the information rate across a Gaussian channel.
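The standard formula in question is the Shannon capacity of a Gaussian channel, which integrates log₂(1 + signal-to-noise ratio) over the channel’s bandwidth. Here is a minimal sketch of that textbook result with toy spectra (our illustration, not the authors’ model):

```python
import numpy as np

def gaussian_channel_rate(freqs, signal_psd, noise_psd):
    """Information rate in bits/s: integral of log2(1 + S/N) over frequency."""
    df = freqs[1] - freqs[0]  # uniform frequency grid assumed
    return float(np.sum(np.log2(1.0 + signal_psd / noise_psd)) * df)

# Toy numbers: a flat signal-to-noise ratio of 10 across a 1 kHz band.
f = np.linspace(0.0, 1e3, 1001)
rate = gaussian_channel_rate(f,
                             signal_psd=np.full_like(f, 10.0),
                             noise_psd=np.ones_like(f))
print(f"rate ~ {rate:.0f} bits/s")  # ~3,460 bits/s, i.e. log2(11) per Hz
```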

The calculations rest on two assumptions. First, Machta and Bryant always assume a physical channel through which physical particles or electrical charges flow in accordance with the physics of the cell. Second, they assume that any such channel is contaminated by thermal noise from the cellular environment.

The spectrum of this noise can be determined with the fluctuation-dissipation theorem, which, as Machta put it, “relates the spectrum of thermal fluctuations to the near-equilibrium response functions.”
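The best-known special case of the fluctuation-dissipation theorem is Johnson-Nyquist noise: a dissipative element with resistance R at temperature T exhibits a flat voltage-noise spectrum of 4k_BTR. A sketch of that special case, for intuition only (the paper works with cellular response functions, not literal resistors, and the gigaohm figure below is a hypothetical number for scale):

```python
k_B = 1.380649e-23  # Boltzmann constant, J/K

def johnson_noise_psd(resistance_ohm, temperature_K):
    """One-sided thermal voltage-noise PSD in V^2/Hz (Johnson-Nyquist/FDT)."""
    return 4.0 * k_B * temperature_K * resistance_ohm

# Hypothetical scale: a membrane patch with ~1 gigaohm effective resistance.
S_V = johnson_noise_psd(1e9, 310.0)
print(f"S_V ~ {S_V:.2e} V^2/Hz  (about {S_V**0.5 * 1e6:.1f} uV per sqrt(Hz))")
```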

The team’s estimates also stand out because they were made with relatively simple models. This allowed the researchers to place reliable lower bounds on the energy needed to power a channel and drive physical currents in a biological system.

According to Machta, “since the signal must overcome thermal noise, we commonly find costs with a geometric prefactor multiplying k_BT/bit.”

The sizes of the sender and receiver both enter this geometric factor. A larger sender typically lowers the per-bit cost, because the dissipative current is spread over a greater surface area. A larger receiver, in turn, averages over more thermal fluctuations, allowing the same information to be carried by a weaker overall signal.

“In the case of electrical signalling, for instance, we obtain a scaling form for the cost per bit of (r²/σ_I σ_O) k_BT/bit, where r is the distance between the sender and the receiver, and σ_I and σ_O are the sizes of the sender and the receiver, respectively. This cost can be many orders of magnitude higher than the k_BT/bit suggested by simpler (or more fundamental) arguments; this is especially relevant for ion channels, which are only a few nanometres wide but transmit information over microns.”
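Plugging rough numbers into that scaling shows how quickly the geometry inflates the cost (our arithmetic, with order-one prefactors dropped as in any scaling estimate): nanometre-scale channels signalling over micron distances land squarely in the 10⁴–10⁷ k_BT/bit range measured by Laughlin.

```python
def electrical_cost_per_bit_kBT(r_nm, sigma_in_nm=1.0, sigma_out_nm=1.0):
    """Scaling estimate (r^2 / (sigma_I * sigma_O)) in units of k_BT per bit."""
    return r_nm**2 / (sigma_in_nm * sigma_out_nm)

# Nanometre-wide sender and receiver, distances from 0.1 to 3 microns:
for r_nm in (100.0, 1000.0, 3000.0):
    cost = electrical_cost_per_bit_kBT(r_nm)
    print(f"r = {r_nm/1000:.1f} um  ->  ~{cost:.0e} k_BT/bit")
```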

Machta and Bryant’s calculations show that transmitting information between cellular components requires a substantial amount of energy. Their estimates offer a potential explanation for the high cost of information processing observed in experimental studies.

Machta noted that their explanation is “less fundamental” than the Landauer bound, because it relies on specifics such as the architecture of neurons and ion channels. “However, if biology is subject to these details, then it is possible that, for example, neurons are efficient, up against real information and energy constraints, rather than merely inefficient. While these numbers don’t prove that any one strategy is optimal, they do illustrate that carrying information through space can take a lot of energy.”

This recent work by Machta and his colleagues could pave the way for exciting new biological research. In their study, the authors also introduce a “phase diagram” illustrating the regimes in which various forms of communication (such as electrical signalling or chemical diffusion) are cheapest.

This diagram may prove useful for understanding the design principles underlying different cell-signalling strategies. For instance, it could help explain why E. coli bacteria use diffusion to transmit information about their chemical environment, and why neurons use chemical diffusion to communicate at synapses but switch to electrical signals when sending information over hundreds of microns from dendrites to the cell body.
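The logic of such a phase diagram is easy to caricature in a few lines: give each mechanism a cost-per-bit curve and, at each distance, pick the cheaper one. The two cost functions below are crude stand-ins invented purely for illustration (a fixed machinery overhead plus the quadratic electrical scaling quoted earlier, against a diffusive cost that is cheap nearby but grows steeply with distance); the paper derives the real expressions.

```python
def electrical_cost_kBT(r_nm, overhead=1e4, sigma_nm=1.0):
    # Hypothetical: fixed machinery overhead plus the r^2 scaling quoted earlier.
    return overhead + r_nm**2 / sigma_nm**2

def diffusive_cost_kBT(r_nm):
    # Hypothetical: cheap at short range, growing steeply with distance.
    return r_nm**3 / 100.0

# Synapse-scale to dendrite-scale distances: 10 nm up to 100 um.
for r_nm in (10.0, 100.0, 1000.0, 100000.0):
    cheaper = ("diffusion" if diffusive_cost_kBT(r_nm) < electrical_cost_kBT(r_nm)
               else "electrical")
    print(f"r = {r_nm:>8.0f} nm  ->  cheaper channel: {cheaper}")
```

With these toy numbers, diffusion wins at short range and electrical signalling wins at long range, qualitatively matching the biological pattern described above.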

The energetics of a specific signal transduction system are something that “we are working on now,” Machta said.

“Applying our bound requires knowledge of the flow of information throughout a network, which is something we didn’t consider in our recent work, since it focused on the abstract cost of conveying information between two separate components. Applying our calculations to specific geometries (such as a ‘spherical’ neuron or a tube-like axon, each significantly different from the infinite plane we used here) is a new technical challenge that arises from this goal.”

Reference:

Samuel J. Bryant et al., Physical Constraints in Intracellular Signaling: The Cost of Sending a Bit, Physical Review Letters (2023). DOI: 10.1103/PhysRevLett.131.068401
